CN110646350B - Product classification method, device, computing equipment and computer storage medium - Google Patents

Product classification method, device, computing equipment and computer storage medium

Info

Publication number
CN110646350B
CN110646350B (granted publication of application CN201910804795.4A; earlier publication CN110646350A)
Authority
CN
China
Prior art keywords
data
spectral feature
feature data
spectrum
product
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910804795.4A
Other languages
Chinese (zh)
Other versions
CN110646350A (en)
Inventor
尹海波
金欢欢
Current Assignee
Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Original Assignee
Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Priority to CN201910804795.4A
Publication of CN110646350A
Application granted
Publication of CN110646350B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/24 Classification techniques
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/31 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N 21/35 Investigating relative effect of material at characteristic wavelengths using infrared light
    • G01N 21/3563 Investigating relative effect of material at characteristic wavelengths using infrared light for analysing solids; Preparation of samples therefor
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N 33/02 Food
    • G01N 33/025 Fruits or vegetables
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing


Abstract

The embodiments of the invention relate to the technical field of machine learning and disclose a product classification method comprising the following steps: acquiring spectral data of a product to be classified; evenly dividing the spectral data of the product to be classified into multiple groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points and each group corresponds to one time step; and inputting the multiple groups of spectral feature data into a classification model to determine the dependency relationships among the groups based on the time steps, and obtaining the type of the product to be classified from those dependency relationships, the classification model being obtained by training a recurrent neural network model with multiple sets of training data. In this manner, the embodiments of the invention classify the products to be classified.

Description

Product classification method, device, computing equipment and computer storage medium
Technical Field
The embodiments of the invention relate to the technical field of machine learning, and in particular to a product classification method, a device, computing equipment and a computer storage medium.
Background
As people's living standards improve, quality requirements for various products, such as fruits and vegetables, keep increasing. Quality inspection and classification has become an important step in the standardized processing flow of various products.
At present, products are classified mainly by manual work.
Disclosure of Invention
In view of the foregoing, embodiments of the present invention provide a product classification method, apparatus, computing device, and computer storage medium that overcomes or at least partially solves the foregoing problems.
According to an aspect of an embodiment of the present invention, there is provided a product classification method, the method including:
acquiring spectrum data of a product to be classified;
evenly dividing the spectral data of the product to be classified into multiple groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points, and each group of spectral feature data corresponds to one time step;
inputting the multiple groups of spectral feature data into a classification model to determine the dependency relationships among the multiple groups of spectral feature data based on the time steps, and obtaining the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with multiple sets of training data.
In an alternative way, before acquiring the spectral data of the product to be classified, the method further comprises:
acquiring multiple sets of training data, each set of the multiple sets of training data comprising: a plurality of sets of spectral feature data of the sample product and identification information for identifying the type of the sample product;
and training a bidirectional gated recurrent unit (Bi-GRU) model with the multiple sets of training data to obtain the classification model.
In an alternative way, acquiring multiple sets of training data includes:
acquiring spectrum data of all sample products;
evenly dividing the spectral data of each sample product into multiple groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points, so as to obtain the multiple groups of spectral feature data of each sample product;
marking each sample product to obtain the identification information of each sample product, wherein the identification information of the sample products of the same kind is the same, and the identification information of the sample products of different kinds is different;
and taking the multiple groups of spectral feature data and the identification information of one sample product as one set of training data, so as to obtain the multiple sets of training data.
In an alternative manner, the Bi-GRU model includes two Bi-GRU layers for learning the dependency relationship between the sets of spectral feature data of each sample product, and one fully connected layer for outputting the classification result.
In an alternative manner, training the Bi-GRU model according to the multiple sets of training data to obtain a classification model, including:
inputting the multiple groups of spectral feature data of the sample product into a first Bi-GRU layer;
learning a dependency relationship among a plurality of groups of spectral feature data of each sample product through a first Bi-GRU layer so as to output a first external state of each sample product;
continuously learning the dependency relationship among the multiple groups of spectral feature data of each sample product according to the first external state through a second Bi-GRU layer so as to output a second external state of each sample product;
weighting the second external state through the full connection layer to obtain a weighted result of each sample product;
normalizing the weighted results through a softmax (normalized exponential function) classifier and taking the maximum, so as to obtain the output result corresponding to each sample product;
calculating a loss function value according to the output result and the identification information;
updating the weight of the Bi-GRU model according to the loss function value until the loss function value is minimum;
and taking the Bi-GRU model with the smallest loss function value as the classification model.
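As an illustrative sketch only, not the patent's implementation, the classification head of the procedure above can be mimicked in a few lines of numpy: random vectors stand in for the second external states, a fully connected layer weights them, a softmax classifier produces the output results, and the cross-entropy loss against the identification information is reduced by gradient descent. All sizes, values, and the learning rate are assumptions, and only the head's weights are updated here, whereas real training would also update the Bi-GRU layers.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, d_state, n_classes = 6, 16, 3

states = rng.standard_normal((n_samples, d_state))  # stand-ins for second external states
labels = np.array([0, 1, 2, 0, 1, 2])               # identification information
W_fc = np.zeros((d_state, n_classes))               # fully connected layer weights

def softmax(a):
    e = np.exp(a - a.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for _ in range(500):                                # gradient descent on cross-entropy
    probs = softmax(states @ W_fc)                  # softmax output results
    loss = -np.log(probs[np.arange(n_samples), labels]).mean()
    grad = probs.copy()
    grad[np.arange(n_samples), labels] -= 1.0       # d(loss)/d(logits)
    W_fc -= 0.05 * (states.T @ grad) / n_samples    # update the head's weights

pred = softmax(states @ W_fc).argmax(axis=1)        # predicted product kinds
```

The loop stops after a fixed number of steps for brevity; the patent instead trains until the loss function value is minimal.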
In an alternative manner, the first Bi-GRU layer includes a forward GRU unit and a reverse GRU unit, the forward GRU unit and the reverse GRU unit being independent of each other;
Learning, by the first Bi-GRU layer, the dependency relationships among the multiple groups of spectral feature data of each sample product so as to output the first external state of each sample product includes:
the forward dependency relationship among multiple groups of spectral feature data of each sample product is learned through the forward GRU unit, and the forward external state of each sample product is output;
the reverse dependency relationship among multiple groups of spectral characteristic data of each sample product is learned through the reverse GRU unit, and the reverse external state of each sample product is output;
and combining the forward external state and the reverse external state to obtain a first external state of each sample product.
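A minimal numpy sketch of this pairing, under assumed sizes (12 time steps, 19 spectral data points per group, 8 hidden units) and random stand-in weights, shows the two independent GRU units reading the same groups in opposite orders and their final memories being combined into the layer's external state:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_last_state(seq, p, d_h):
    """Run one independent GRU unit over seq and return its final memory."""
    h = np.zeros(d_h)
    for x_t in seq:
        r = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h)            # reset gate
        z = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h)            # update gate
        h_ret = np.tanh(p["W"] @ x_t + p["U"] @ (r * h))      # retained candidate
        h = z * h + (1.0 - z) * h_ret                         # new memory
    return h

rng = np.random.default_rng(2)
d_in, d_h, n_steps = 19, 8, 12

def make_params():
    p = {k: rng.standard_normal((d_h, d_in)) for k in ("W_r", "W_z", "W")}
    p.update({k: rng.standard_normal((d_h, d_h)) for k in ("U_r", "U_z", "U")})
    return p

x_seq = rng.standard_normal((n_steps, d_in))           # 12 groups of spectral feature data

fwd = gru_last_state(x_seq, make_params(), d_h)        # forward external state
bwd = gru_last_state(x_seq[::-1], make_params(), d_h)  # reverse external state
first_external_state = np.concatenate([fwd, bwd])      # combined first external state
```

The two units use separate weights, reflecting their independence, and concatenation is one plausible reading of "combining" the two states.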
In an optional manner, the forward GRU unit includes a reset gate and an update gate, and the forward GRU unit learns forward dependency relationships among the multiple sets of spectral feature data of each sample product, and outputs a forward external state of each sample product, including:
calculating gating signals of the reset gate and the update gate according to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit, wherein the first spectral feature data and the second spectral feature data are spectral feature data of the same sample product, and the first spectral feature data is the group of spectral feature data immediately following the second spectral feature data;
determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data;
determining the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate;
and outputting the memory information of the last group of spectral feature data of each sample product as the forward external state.
In an alternative manner, calculating the gating signals of the reset gate and the update gate according to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit includes calculating the gating signals according to the following formulas:
r_t = σ(W_r·x_t + U_r·h_{t−1})
z_t = σ(W_z·x_t + U_z·h_{t−1})
where r_t and z_t represent the gating signal of the reset gate and the gating signal of the update gate respectively, σ represents the sigmoid function, W_r, U_r and W_z, U_z represent the weight matrices of the reset gate and of the update gate respectively, h_{t−1} represents the memory information of the second spectral feature data, and x_t represents the first spectral feature data.
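As a numeric illustration of the two gating formulas above, with made-up dimensions (19 spectral data points per group, 8 hidden units) and random stand-in weight matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 19, 8                     # assumed sizes: 19 points per group, 8 hidden units

# Random stand-ins for the reset-gate and update-gate weight matrices.
W_r, U_r = rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h))
W_z, U_z = rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x_t = rng.standard_normal(d_in)       # first spectral feature data (current group)
h_prev = np.zeros(d_h)                # memory information of the second (previous) group

r_t = sigmoid(W_r @ x_t + U_r @ h_prev)   # reset-gate signal
z_t = sigmoid(W_z @ x_t + U_z @ h_prev)   # update-gate signal
```

Because both signals pass through the sigmoid, every component lies strictly between 0 and 1, which is what lets them act as soft gates.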
In an alternative manner, determining the retention of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data includes:
according to the gating signal of the reset gate and the first spectral feature data, the retention amount of the memory information of the second spectral feature data is determined according to the following formula:
h_t' = tanh(W·x_t + U·(r_t ⊙ h_{t−1}))
where h_t' represents the retention amount of the memory information of the second spectral feature data, h_{t−1} represents the memory information of the second spectral feature data, r_t represents the gating signal of the reset gate, x_t represents the first spectral feature data, ⊙ represents element-wise multiplication, and W and U represent the weight matrices of the forward GRU unit.
In an alternative manner, determining the memory information of the first spectral feature data according to the reserved quantity and the gate control signal of the update gate includes:
the memory information of the first spectral feature data is determined according to the retention amount and the gating signal of the update gate by the following formula:
h_t = z_t ⊙ h_{t−1} + (1 − z_t) ⊙ h_t'
where h_t represents the memory information of the first spectral feature data, z_t represents the gating signal of the update gate, h_{t−1} represents the memory information of the second spectral feature data, and h_t' represents the retention amount of the memory information of the second spectral feature data.
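Putting the gate, retention, and memory-update computations described above together, a single forward-GRU step can be sketched in numpy as follows; the sizes and weights are illustrative assumptions, and the memory of the last of the 12 groups is kept as the forward external state:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, p):
    """One forward-GRU update combining the gate, retention, and memory formulas."""
    r_t = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h_prev)        # reset-gate signal
    z_t = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h_prev)        # update-gate signal
    h_ret = np.tanh(p["W"] @ x_t + p["U"] @ (r_t * h_prev))  # retention amount h_t'
    return z_t * h_prev + (1.0 - z_t) * h_ret                # new memory h_t

rng = np.random.default_rng(1)
d_in, d_h = 19, 8                                            # assumed sizes
params = {k: rng.standard_normal((d_h, d_in)) for k in ("W_r", "W_z", "W")}
params.update({k: rng.standard_normal((d_h, d_h)) for k in ("U_r", "U_z", "U")})

# 12 groups of spectral feature data for one sample product; the memory
# of the last group is output as the forward external state.
x_seq = rng.standard_normal((12, d_in))
h = np.zeros(d_h)
for x_t in x_seq:
    h = gru_step(x_t, h, params)
forward_state = h
```

Since each update is a convex combination of the previous memory and a tanh-bounded retention term, the state stays within [−1, 1] component-wise when initialized at zero.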
According to another aspect of an embodiment of the present invention, there is provided a product classification apparatus including:
an acquisition module, configured to acquire spectral data of a product to be classified;
a splitting module, configured to evenly divide the spectral data of the product to be classified into multiple groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points and each group corresponds to one time step;
an input module, configured to input the multiple groups of spectral feature data into a classification model so as to determine the dependency relationships among the multiple groups of spectral feature data based on the time steps, and to obtain the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with multiple sets of training data.
According to another aspect of an embodiment of the present invention, there is provided a computing device including: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with each other through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the product classification method described above.
According to still another aspect of the embodiments of the present invention, there is provided a computer storage medium having at least one executable instruction stored therein, the executable instruction causing a processor to perform the operations corresponding to the above product classification method.
According to the embodiments of the invention, the acquired spectral data of the product to be classified are divided to obtain multiple groups of spectral feature data, and these groups are input into the classification model to obtain the type of the product. Because the classification model is obtained by training a recurrent neural network model with multiple sets of training data, it captures the dependency relationships among the groups of spectral feature data and the product classes corresponding to different dependency relationships, so the product to be classified can be classified accurately by the classification model.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be understood more clearly and implemented according to the contents of the specification, specific embodiments of the present invention are described below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of a method for classifying products according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a method for classifying products according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a Bi-GRU layer in a product classification method according to a second embodiment of the present invention;
FIG. 3a is a schematic diagram showing a structure of a forward GRU unit in a product classification method according to a second embodiment of the invention;
FIG. 4 is a functional block diagram of a product sorting apparatus according to a third embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a computing device according to a fourth embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The embodiments of the invention are suitable for application scenarios in which products are classified. Specifically, the embodiments can be applied to classifying products of different kinds, such as distinguishing different fruits to identify their kinds (such as apples, bananas and the like); the embodiments can also be applied to further classifying products of the same kind, such as apples, to identify the specific apple category (such as green apples, red apples and the like). The embodiments are not limited to the application scenarios listed above.
The general principle of the embodiments of the invention is as follows. When laser light irradiates products, the spectral data generated by diffuse reflection differ from product to product, so the spectral data can be used for feature recognition and discrimination. A recurrent neural network model learns the spectral feature data corresponding to each product and determines the dependency relationships among the spectral feature data of different products, yielding a classification model that classifies products based on their spectral feature data. Classifying products with the trained model improves classification efficiency. Moreover, the trained classification model can determine the dependency relationships among multiple groups of spectral feature data of a product and the product classes corresponding to different dependency relationships; compared with classification based on a single feature or several mutually independent features, this is more reliable, so the accuracy of product classification can be improved.
The technical scheme of the embodiment of the invention is specifically described below.
Fig. 1 shows a flow chart of a product classification method according to a first embodiment of the invention. As shown in fig. 1, the method comprises the steps of:
Step 110: acquiring spectral data of the product to be classified.
The products to be classified in this step may be different varieties within the same category, for example apples, bananas and oranges among fruits; different categories of products, for example fruits and vegetables; or different varieties across different categories, for example apples and eggplants. An electromagnetic-wave emitter with a certain wavelength range emits electromagnetic waves toward the product to be classified; after the waves reach the product, the product absorbs the waves at some wavelengths, and a spectrum acquisition device determines the spectrum of the absorbed waves to obtain the spectral data of the product. It should be noted that different products absorb electromagnetic waves at different spectra, so the spectral data obtained from the absorbed waves can effectively distinguish different products. The electromagnetic waves used in the embodiments of the invention have wavelengths between microwave and visible light, for example infrared light. When infrared light irradiates the products to be classified, their spectral data are collected by an infrared spectrum detector. The number of collected spectral data points depends on the sampling frequency of the acquisition device: for example, if electromagnetic waves in the 900-1700 nm wavelength range are emitted toward the product and receiving all of them takes 228 milliseconds, then at a sampling frequency of one point per millisecond the acquisition device collects 228 spectral data points in total.
Step 120: evenly dividing the spectral data of the product to be classified into multiple groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points, and each group of spectral feature data corresponds to one time step.
In this step, the purpose of dividing the spectral data is to generate time steps, so that the dependency relationships between time steps can be learned by a recurrent neural network. The time step is the basis for determining the dependency relationships among the groups of spectral feature data and is equivalent to a time dimension of the spectral feature data: after the spectral data are divided into multiple groups of spectral feature data, the data gain one time dimension compared with the original spectral data. The spectral data are divided evenly, each group contains the same number of spectral data points, and each group corresponds to one time step. The embodiments of the invention do not limit the number of parts into which the spectral data are divided; each part obtained by the division is one group of spectral feature data corresponding to one time step. The number of parts may correspond to the number of neurons in the classification model, that is, the number of parts equals the number of neurons in the classification model.
For example, the spectral data form a matrix of 10 rows and 228 columns for one product to be classified, containing 228 spectral data points per row. The spectral data are divided into 12 parts by the number of data points, giving 12 groups of spectral feature data, each containing 19 spectral data points; 12 time steps are generated according to the division order, and the spectral feature data at successive time steps have dependency relationships. In the specific implementation, the collected spectral data are arranged in order of increasing wavelength; when the spectral data are divided, the division follows this order and the arrangement is unchanged.
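The splitting described above amounts to a reshape of the ordered spectral data; a small sketch with stand-in values (228 points, 12 groups of 19) could look like:

```python
import numpy as np

# Stand-in spectrum: 228 data points ordered by wavelength (900-1700 nm).
spectrum = np.linspace(900.0, 1700.0, 228)

n_steps = 12                                      # one group per time step
spectral_features = spectrum.reshape(n_steps, -1) # shape (12, 19)

# The division preserves the wavelength order: group t holds
# points t*19 .. t*19+18 of the original spectrum.
print(spectral_features.shape)
```

A reshape keeps the original arrangement unchanged, matching the requirement that the division follow the wavelength order.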
Step 130: inputting the multiple groups of spectral feature data into a classification model to determine the dependency relationships among them based on the time steps, and obtaining the type of the product to be classified based on the dependency relationships.
In this step, the classification model is obtained by training a recurrent neural network model with multiple sets of training data. The recurrent neural network is used to learn the dependency relationships among the multiple groups of spectral feature data; these dependency relationships differ for different products, and there is a one-to-one correspondence between the dependency relationships and the product types, so once the dependency relationships of a product are determined, its type is uniquely determined. The number of neurons of the recurrent neural network that receive spectral feature data matches the number of groups of spectral feature data; dividing the spectral data in step 120 therefore determines this number of neurons. For example, if the spectral data are divided into 12 groups of spectral feature data, the number of neurons receiving spectral feature data is 12, and each neuron receives one group. In the implementation, the recurrent neural network may be any network that can learn the dependency relationships between features, such as a long short-term memory network (LSTM), a bidirectional LSTM, a gated recurrent unit network (GRU), or a bidirectional gated recurrent unit network (Bi-GRU).
According to the embodiments of the invention, the acquired spectral data of the product to be classified are divided to obtain multiple groups of spectral feature data, and these groups are input into the classification model to obtain the type of the product. Because the classification model is obtained by training a recurrent neural network model with multiple sets of training data, it captures the dependency relationships among the groups of spectral feature data and the product classes corresponding to different dependency relationships, so the product can be classified accurately according to the classification model. In addition, compared with general machine-learning algorithms, which can only classify products according to an unordered combination of features, the recurrent neural network used in the embodiments takes the dependency relationships among the groups of spectral feature data as the basis for classification, with the groups connected to one another; this is more reliable than a single feature or several mutually independent features. Moreover, when a general machine-learning algorithm is used to train a classification model, the training effect is limited by the number of training samples and the feature dimension of the training data, and a good training effect is achieved only when the number of samples is close to the feature dimension.
Fig. 2 shows a flow chart of a product classification method according to a second embodiment of the invention, as shown in fig. 2, comprising the steps of:
step 210: multiple sets of training data are acquired.
In this step, the multiple sets of training data are obtained from the spectral data of all sample products, where the types of the sample products cover the types the products to be classified may belong to; in a specific implementation, several products of each type are selected as sample products. Each of the multiple sets of training data includes: multiple sets of spectral feature data of a sample product and identification information identifying the sample product's type. The spectral data of each sample product is divided evenly into multiple sets of spectral feature data, each set containing the same number of spectral data, to obtain the multiple sets of spectral feature data of that sample product. The spectral data are arranged in order of increasing wavelength; when the spectral data is divided, it is divided according to this arrangement order, each set of spectral feature data obtained corresponds to one time step, and the time steps increase one by one following the order of the spectral data the sets contain. For example, 228 spectral data arranged from shortest to longest wavelength are divided in order into 12 sets of spectral feature data of 19 spectral data each; the first set corresponds to time step 1, the second to time step 2, and so on, until the last set corresponds to time step 12.
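As an illustration (not part of the patent disclosure), the even division into time steps can be sketched as follows, using the 228-point, 12-step, 19-per-step example from the text; the array values are placeholders:

```python
import numpy as np

def split_into_time_steps(spectral_data, n_steps):
    """Divide a spectrum (already sorted by increasing wavelength) evenly
    into n_steps sets of spectral feature data, one set per time step."""
    data = np.asarray(spectral_data, dtype=float)
    if data.size % n_steps != 0:
        raise ValueError("spectral data must divide evenly into time steps")
    # Row t-1 holds the set of spectral feature data for time step t.
    return data.reshape(n_steps, data.size // n_steps)

spectrum = np.arange(228, dtype=float)   # placeholder intensities
steps = split_into_time_steps(spectrum, 12)
print(steps.shape)                       # (12, 19)
print(steps[0][:3])                      # first set: [0. 1. 2.]
```

Each of the 12 rows would then be fed to one recurrent neuron, in time-step order.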
The category to which each sample product belongs is known, and each sample product is identified by identification information: sample products of the same category share the same identification information, and sample products of different categories have different identification information. The form of the identification information may be chosen by those skilled in the art when implementing the embodiments of the invention, for example an English letter or a number for each kind of sample product. In a specific embodiment of the invention, the different kinds of sample products are identified by binary numbers; for example, with five kinds of products to be classified, the identification information of the five kinds of sample products is 00001, 00010, 00100, 01000 and 10000, respectively. The multiple sets of spectral feature data of each sample product together with the corresponding identification information form one set of training data, and the training data of all sample products form the multiple sets of training data.
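The binary identification scheme can be sketched as follows; the kind names are hypothetical placeholders, and only the five one-hot codes come from the text:

```python
import numpy as np

kinds = ["kind_a", "kind_b", "kind_c", "kind_d", "kind_e"]   # hypothetical names

def one_hot_id(kind):
    """Return the binary identification string for a kind: the bit for the
    first kind sits in the lowest position, giving 00001 ... 10000."""
    vec = np.zeros(len(kinds), dtype=int)
    vec[len(kinds) - 1 - kinds.index(kind)] = 1
    return "".join(map(str, vec))

print(one_hot_id("kind_a"))   # 00001
print(one_hot_id("kind_e"))   # 10000
```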
Step 220: training a bidirectional gated recurrent unit (Bi-GRU) model according to the multiple sets of training data to obtain the classification model.
In this step, the Bi-GRU model includes at least one Bi-GRU layer, which learns the dependency relationship between the sets of spectral feature data of each sample product, and one fully connected layer, which outputs the classification result. In some embodiments, the Bi-GRU model comprises two Bi-GRU layers. Multiple sets of spectral feature data of several sample products are selected from all sample products and input into the first Bi-GRU layer, which learns the dependency relationship between the multiple sets of spectral feature data of each sample product and outputs a first external state for each sample product. The number of sample products selected each time may be any number within what the hardware tolerates, chosen by a person skilled in the art when implementing the embodiment of the invention; for example, if the size of the video memory limits the number of sample products that can be processed at once to 50, then the spectral feature data of any number of sample products between 1 and 50 may be selected and input into the first Bi-GRU layer each time. The number of sample products selected and processed each time is less than or equal to the total number of sample products, and one epoch is completed through multiple iterations.
The first external state output by the first Bi-GRU layer serves as the input of the second Bi-GRU layer, which continues to learn the dependency relationship among the multiple sets of spectral feature data of each sample product according to the first external state and outputs the second external state. The first external state and the second external state are each a multidimensional array representing the multiple sets of spectral feature data, and the elements of the array represent the dependency relationships among the sets. The dependency relationship characterized by the second external state is learned on the basis of the first external state, so the second external state better represents the dependency relationship among the multiple sets of spectral feature data.
The first Bi-GRU layer and the second Bi-GRU layer have the same structure and learn input data in the same way, so one Bi-GRU layer is taken as an example. Fig. 3 shows a schematic structural diagram of a Bi-GRU layer. As shown in fig. 3, the Bi-GRU layer includes a forward GRU unit and a reverse GRU unit that are independent of each other; the two units have the same structure and a memory function, and differ only in the direction of the dependency they learn. The forward GRU unit learns the forward dependency relationship among the multiple sets of spectral feature data of each sample product and outputs the forward external state of each sample product. The dependency relationship learned in order of increasing time step is the forward dependency relationship: for example, the spectral feature data of time step 2 learns its dependency on the spectral feature data of time step 1, the spectral feature data of time step 3 learns its dependency on the spectral feature data of time step 2, and so on, until the last set of spectral feature data learns its dependency on the set before it; the dependency relationships between each pair of adjacent sets are output as the forward external state.
The reverse GRU unit learns the reverse dependency relationship among the multiple sets of spectral feature data of each sample product and outputs the reverse external state of each sample product. The dependency relationship learned in order of decreasing time step is the reverse dependency relationship: for example, the spectral feature data of time step 11 learns its dependency on the spectral feature data of time step 12, the spectral feature data of time step 10 learns its dependency on the spectral feature data of time step 11, and so on, until the spectral feature data of time step 1 learns its dependency on the spectral feature data of time step 2; the dependency relationships between each pair of adjacent sets are output as the reverse external state.
And combining the forward external state output by the forward GRU unit and the reverse external state output by the reverse GRU unit to obtain a first external state of each sample product. Wherein, the output processes of the forward external state and the reverse external state are only opposite in direction, and the output processes are the same. Taking the output process of the forward external state as an example, the output process of the forward external state will be further described.
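The forward and reverse scans and the combination of their final states can be sketched as follows. This is a simplified stand-in (a plain tanh recurrence with random weights rather than a full GRU cell), intended only to show the mechanics of scanning the time steps in both directions and concatenating the results:

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_in, n_hidden = 12, 19, 8          # step counts from the text; hidden size assumed
W = rng.normal(scale=0.1, size=(n_hidden, n_in))
U = rng.normal(scale=0.1, size=(n_hidden, n_hidden))

def scan(steps):
    """Run a simple recurrence over the given order of time steps and
    return the final state (a stand-in for a GRU unit's external state)."""
    h = np.zeros(n_hidden)
    for x in steps:
        h = np.tanh(W @ x + U @ h)           # simplified cell, not a full GRU
    return h

x = rng.normal(size=(n_steps, n_in))          # one sample's 12 sets of data
forward_state = scan(x)                       # time steps 1 -> 12
reverse_state = scan(x[::-1])                 # time steps 12 -> 1
first_external_state = np.concatenate([forward_state, reverse_state])
print(first_external_state.shape)             # (16,)
```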
Referring to fig. 3a, which shows a schematic structural diagram of the forward GRU unit, the forward GRU unit includes a reset gate and an update gate, and records memory information of second spectral feature data, which can be regarded as the dependency information between the second spectral feature data and the set of spectral feature data before it. The second spectral feature data and the input first spectral feature data are spectral feature data of the same sample product, and the first spectral feature data is the set of spectral feature data following the second spectral feature data, i.e. the time step of the first spectral feature data is one time step later than that of the second spectral feature data. For example, if the input first spectral feature data corresponds to time step 10, i.e. it is the tenth set produced by the division, then the second spectral feature data is the set of spectral feature data with time step 9. According to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit, the gating signals of the reset gate and the update gate are calculated according to the following formulas:
r_t = σ(W_r x_t + U_r h_{t-1})

z_t = σ(W_z x_t + U_z h_{t-1})

where r_t and z_t denote the gating signal of the reset gate and the gating signal of the update gate respectively, σ denotes the sigmoid function, W_r, U_r and W_z, U_z denote the weight matrices of the reset gate and of the update gate respectively, h_{t-1} denotes the memory information of the second spectral feature data, and x_t denotes the first spectral feature data.
According to the gating signal of the reset gate and the first spectrum characteristic data, the retention amount of the memory information of the second spectrum characteristic data is determined according to the following formula:
h_t' = tanh(W x_t + U (r_t ⊙ h_{t-1}))

where h_t' denotes the retention amount of the memory information of the second spectral feature data, h_{t-1} denotes the memory information of the second spectral feature data, r_t denotes the gating signal of the reset gate, ⊙ denotes the element-wise product, and W and U denote the weight matrices of the forward GRU unit.
According to the retention amount h_t' and the gating signal z_t of the update gate, the memory information of the first spectral feature data is determined according to the following formula:

h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h_t'

where h_t denotes the memory information of the first spectral feature data, z_t denotes the gating signal of the update gate, h_{t-1} denotes the memory information of the second spectral feature data, and h_t' denotes the retention amount of the memory information of the second spectral feature data.
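The gate and update formulas above can be transcribed directly into code. The following is an illustrative sketch with random weights and assumed sizes (19 inputs per time step from the text's example, hidden size 8), not the patent's implementation:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, Wr, Ur, Wz, Uz, W, U):
    """One forward-GRU update following the formulas in the text:
    reset gate, update gate, retention amount, then new memory information."""
    r_t = sigmoid(Wr @ x_t + Ur @ h_prev)          # reset-gate signal
    z_t = sigmoid(Wz @ x_t + Uz @ h_prev)          # update-gate signal
    h_ret = np.tanh(W @ x_t + U @ (r_t * h_prev))  # retention amount h_t'
    return (1.0 - z_t) * h_prev + z_t * h_ret      # memory info of x_t

rng = np.random.default_rng(1)
n_in, n_hidden = 19, 8
mats = [rng.normal(scale=0.1, size=s)
        for s in [(n_hidden, n_in), (n_hidden, n_hidden)] * 3]  # Wr,Ur,Wz,Uz,W,U
h = np.zeros(n_hidden)
for x in rng.normal(size=(12, n_in)):              # cycle over the 12 time steps
    h = gru_step(x, h, *mats)
print(h.shape)                                     # (8,)
```

The final `h` plays the role of the memory information of the last set of spectral feature data, i.e. the forward external state.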
The forward GRU unit cyclically updates the memory information in this manner until the memory information of the last set of spectral feature data is obtained, and outputs it as the forward external state. The reverse external state output by the reverse GRU unit is the memory information of the first set of spectral feature data along the time-step dimension. The forward external state and the reverse external state are combined to obtain the first external state of the first Bi-GRU layer.
The second Bi-GRU layer continuously learns the dependency relationship between the multiple sets of spectral feature data of each sample product according to the first external state, and outputs the second external state, and the output process of the second external state is similar to the output process of the first external state, and only the input data is changed from the multiple sets of spectral feature data to the first external state, so that the output process of the second Bi-GRU layer is described with reference to fig. 3a, and is not repeated here.
The second external states are weighted through the fully connected layer, whose number of neurons equals the number of types among all sample products; each component of the second external state corresponds to a preset weight, and after weighting, a weighted result of each sample product for each type is obtained. The weighted results are then passed through a softmax classifier to obtain the output result for each sample product. The softmax classifier uses the softmax function, a normalized exponential function that maps the weighted results into the range 0 to 1; the mapped values can be regarded as the probability of each class, and the class corresponding to the maximum value is the predicted class of the sample product. The output result is calculated as follows:
s_θ(o_i)_c = exp(θ_c^T o_i) / Σ_{j=1}^{k} exp(θ_j^T o_i),  c = 1, …, k

where o_i denotes the second external state output by the second Bi-GRU layer for the i-th sample product, s_θ(o_i) denotes the output result of the softmax classifier on the i-th sample product, θ denotes the weight matrix of the fully connected layer, and k denotes the number of classes of sample products; for example, if the sample products to be classified fall into 5 classes, then k = 5.
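A minimal sketch of the softmax computation above; the 16-dimensional second external state is an assumed size, and the weights are random placeholders:

```python
import numpy as np

def softmax_output(theta, o_i):
    """Map the fully connected layer's weighted results for one sample to
    class probabilities, as in the formula above."""
    logits = theta @ o_i                      # one weighted result per class
    e = np.exp(logits - logits.max())         # shifted for numerical stability
    return e / e.sum()

rng = np.random.default_rng(2)
k, dim = 5, 16                                # 5 classes; state dimension assumed
theta = rng.normal(size=(k, dim))
o_i = rng.normal(size=dim)
probs = softmax_output(theta, o_i)
print(probs.shape)                            # (5,) — one probability per class
print(int(np.argmax(probs)))                  # index of the predicted class
```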
The loss function value is calculated from the output result of each sample product and the identification information of the sample product. The specific form of the loss function may be chosen by a person skilled in the art; the embodiment of the invention does not limit the type of loss function. In a specific embodiment, the loss function is a cross-entropy loss function with an L2 regularization term, calculated as follows:
J = -(1/m) Σ_{i=1}^{m} log s_θ(o_i)_{y_i} + λ Σ_{l=1}^{q} Σ_{j=1}^{k} θ_{lj}²

where J denotes the loss function value, m denotes the number of input samples, y_i denotes the class indicated by the identification information of the i-th sample, q denotes the number of neurons of the second Bi-GRU layer, i.e. the dimension of the second external state, θ_{lj} denotes the weight between the l-th neuron of the second Bi-GRU layer and the j-th neuron of the fully connected layer, and λ denotes the weight factor.
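A sketch of the cross-entropy-plus-L2 loss above; the probability values, λ, and the 16-dimensional θ are illustrative assumptions:

```python
import numpy as np

def cross_entropy_l2(probs, labels, theta, lam):
    """Cross-entropy loss with an L2 term on the fully connected weights,
    following the formula above. probs: (m, k) softmax outputs; labels:
    (m,) true class indices taken from the identification information."""
    m = probs.shape[0]
    ce = -np.mean(np.log(probs[np.arange(m), labels]))
    return ce + lam * np.sum(theta ** 2)

probs = np.array([[0.7, 0.1, 0.1, 0.05, 0.05],
                  [0.2, 0.6, 0.1, 0.05, 0.05]])
labels = np.array([0, 1])
theta = np.zeros((16, 5))        # 16 = assumed dimension of the second external state
loss = cross_entropy_l2(probs, labels, theta, lam=0.01)
print(round(loss, 4))            # 0.4338 = -(ln 0.7 + ln 0.6) / 2, L2 term zero
```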
The weights of the Bi-GRU model are updated according to the loss function value until the loss function value is minimal, and the Bi-GRU model with the minimal loss function value is taken as the classification model. When the weights are updated, the weights θ_{lj} are corrected by back propagation until the loss function value is minimal; the weight values that minimize the loss function, together with the corresponding Bi-GRU model structure, are taken as the classification model.
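The back-propagation update can be illustrated for the fully connected layer alone (in the full model, gradients also flow back through the Bi-GRU layers); all sizes, the learning rate, and the random data are assumptions for this sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
m, dim, k, lam, lr = 32, 16, 5, 1e-3, 0.1
O = rng.normal(size=(m, dim))                 # second external states (stand-ins)
y = rng.integers(0, k, size=m)                # true classes from the identification info
Y = np.eye(k)[y]                              # one-hot targets
theta = np.zeros((k, dim))                    # fully connected weights

def loss_and_grad(theta):
    """Cross-entropy + L2 loss and its gradient w.r.t. the layer weights."""
    logits = O @ theta.T
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    P = e / e.sum(axis=1, keepdims=True)      # softmax outputs, shape (m, k)
    J = -np.mean(np.log(P[np.arange(m), y])) + lam * np.sum(theta ** 2)
    grad = (P - Y).T @ O / m + 2 * lam * theta
    return J, grad

losses = []
for _ in range(100):                          # update weights until J stops falling
    J, g = loss_and_grad(theta)
    losses.append(J)
    theta -= lr * g
print(losses[0] > losses[-1])                 # True: the loss decreased
```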
Step 230: and acquiring spectrum data of the product to be classified.
In this step, the number of spectral data of the product to be classified is identical to the number of spectral data of each sample product.
Step 240: dividing the spectral data of the product to be classified evenly into multiple sets of spectral feature data, wherein each set of spectral feature data contains the same number of spectral data and one set of spectral feature data corresponds to one time step.
In this step, the division of the spectral data of the product to be classified is consistent with the division of the spectral data of the sample product when the classification model is trained, for example, 228 spectral data contained in each sample product is divided into 12 time steps, each time step contains 19 spectral data, and when the 228 spectral data of the product to be classified is divided, the division is also divided into 12 time steps, each time step contains 19 spectral data.
Step 250: and inputting a plurality of groups of spectral feature data of the products to be classified into a classification model to determine the dependency relationship among the plurality of groups of spectral feature data based on time steps, and obtaining the types of the products to be classified based on the dependency relationship.
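An end-to-end inference sketch (not from the patent): the spectrum is divided exactly as during training, passed to the classifier, and the winning class index is mapped back to an identification code. The model here is a random placeholder standing in for the trained classification model:

```python
import numpy as np

rng = np.random.default_rng(4)
ids = ["00001", "00010", "00100", "01000", "10000"]

def classify(spectrum, model_fn):
    """Split a 228-point spectrum into the same 12x19 layout used in
    training, run the model, and return the predicted identification."""
    steps = np.asarray(spectrum, dtype=float).reshape(12, 19)
    probs = model_fn(steps)                    # softmax output, shape (5,)
    return ids[int(np.argmax(probs))]

stub_model = lambda steps: rng.dirichlet(np.ones(5))   # placeholder, not trained
label = classify(rng.normal(size=228), stub_model)
print(label in ids)                            # True
```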
According to the embodiment of the invention, a Bi-GRU model is trained on multiple sets of training data to obtain the classification model. Because the Bi-GRU model can learn both the forward and the reverse dependency relationships of the multiple sets of spectral feature data, the classification model obtained by training it includes the dependency relationships in both directions between the spectral feature data corresponding to each type of sample product, so the products to be classified can be classified reliably.
Fig. 4 shows a functional block diagram of a product classification device according to a third embodiment of the invention. As shown in fig. 4, the apparatus includes: an acquisition module 410, a division module 420 and an input module 430. The acquisition module 410 is used for acquiring spectral data of a product to be classified. The division module 420 is configured to divide the spectral data of the product to be classified into multiple sets of spectral feature data, where each set of spectral feature data includes the same number of spectral data and one set of spectral feature data corresponds to one time step. The input module 430 is configured to input the multiple sets of spectral feature data into a classification model to determine the dependency relationship between the multiple sets of spectral feature data based on the time steps and to obtain the type of the product to be classified based on that dependency relationship, where the classification model is obtained by training a recurrent neural network model with multiple sets of training data.
In an alternative, the apparatus further comprises: a first obtaining module 440, configured to obtain multiple sets of training data, where each of the multiple sets of training data includes: multiple sets of spectral feature data of a sample product and identification information for identifying the sample product type; and a training module 450, configured to train a bidirectional gated recurrent unit (Bi-GRU) model according to the multiple sets of training data to obtain the classification model.
In an alternative manner, the first obtaining module 440 is further configured to acquire spectral data of all sample products; divide the spectral data of each sample product evenly into multiple sets of spectral feature data, each set containing the same number of spectral data, to obtain the multiple sets of spectral feature data of each sample product; mark each sample product to obtain its identification information, where the identification information of sample products of the same kind is the same and the identification information of sample products of different kinds differs; and take the multiple sets of spectral feature data and the identification information of one sample product as one set of training data, to obtain the multiple sets of training data.
In an alternative manner, the Bi-GRU model includes two Bi-GRU layers for learning the dependency relationship between the multiple sets of spectral feature data of each sample product and one fully connected layer for outputting the classification result.
In an alternative manner, training module 450 is further configured to input multiple sets of spectral feature data of the sample product into the first Bi-GRU layer; learning a dependency relationship among a plurality of groups of spectral feature data of each sample product through a first Bi-GRU layer so as to output a first external state of each sample product; continuously learning the dependency relationship among the multiple groups of spectral feature data of each sample product according to the first external state through a second Bi-GRU layer so as to output a second external state of each sample product; weighting the second external state through the full connection layer to obtain a weighted result of each sample product; the weighted results are classified and maximized through a normalized exponential function softmax classifier, and output results corresponding to each sample product are obtained; calculating a loss function value according to the output result and the identification information; updating the weight of the Bi-GRU model according to the loss function value until the loss function value is minimum; and taking the Bi-GRU model with the smallest loss function value as the classification model.
In an alternative manner, the first Bi-GRU layer includes a forward GRU unit and a reverse GRU unit, the forward GRU unit and the reverse GRU unit being independent of each other, the training module 450 is further configured to: the forward dependency relationship among multiple groups of spectral feature data of each sample product is learned through the forward GRU unit so as to output the forward external state of each sample product; learning inverse dependency relations among multiple groups of spectral feature data of each sample product through the inverse GRU unit so as to output an inverse external state of each sample product; and combining the forward external state and the reverse external state to obtain a first external state of each sample product.
In an alternative manner, the forward GRU unit includes a reset gate and an update gate, and the training module 450 is further configured to calculate gating signals of the reset gate and the update gate according to a first spectral feature data and a memory information of a second spectral feature data recorded by the forward GRU unit, where the first spectral feature data and the second spectral feature data are spectral feature data of a same sample product, and the first spectral feature data is a next set of spectral feature data of the second spectral feature data; determining the memory information retention amount of the second spectrum characteristic data according to the gating signal of the reset gate and the first spectrum characteristic data; determining memory information of the first spectral feature data according to the reserved quantity and a gating signal of the updating gate; and outputting the memory information of the last set of spectral feature data of each sample product as the forward external state.
In an alternative approach, the training module 450 is further configured to:
according to the first spectral feature data and the memory information of the second spectral feature data recorded by the GRU neuron, the gating signals of the reset gate and the update gate are calculated according to the following formula:
r_t = σ(W_r x_t + U_r h_{t-1})

z_t = σ(W_z x_t + U_z h_{t-1})

where r_t and z_t denote the gating signal of the reset gate and the gating signal of the update gate respectively, σ denotes the sigmoid function, W_r, U_r and W_z, U_z denote the weight matrices of the reset gate and of the update gate respectively, h_{t-1} denotes the memory information of the second spectral feature data, and x_t denotes the first spectral feature data.
In an alternative approach, the training module 450 is further configured to:
according to the gating signal of the reset gate and the first spectrum characteristic data, the retention of the memory information of the second spectrum characteristic data is determined according to the following formula:
h_t' = tanh(W x_t + U (r_t ⊙ h_{t-1}))

where h_t' denotes the retention amount of the memory information of the second spectral feature data, h_{t-1} denotes the memory information of the second spectral feature data, r_t denotes the gating signal of the reset gate, ⊙ denotes the element-wise product, and W and U denote the weight matrices of the forward GRU unit.
The training module 450 is further configured to:
and determining the memory information of the first spectral feature data according to the reserved quantity and the gating signal of the updating gate by the following formula:
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h_t'

where h_t denotes the memory information of the first spectral feature data, z_t denotes the gating signal of the update gate, h_{t-1} denotes the memory information of the second spectral feature data, and h_t' denotes the retention amount of the memory information of the second spectral feature data.
In the embodiment of the invention, the spectral data of the product to be classified acquired by the acquisition module 410 is divided by the division module 420 to obtain multiple sets of spectral feature data of the product to be classified, and the input module 430 inputs the multiple sets of spectral feature data into the classification model to obtain the type of the product to be classified. Because the classification model is obtained by training a recurrent neural network model with multiple sets of training data, it includes the dependency relationships among the multiple sets of spectral feature data and the product classification results corresponding to sets with different dependency relationships, so the product to be classified can be classified accurately according to the classification model.
The embodiment of the invention provides a non-volatile computer storage medium, which stores at least one executable instruction, and the computer executable instruction can execute the operation corresponding to one product classification method in any method embodiment.
FIG. 5 is a schematic diagram of a computing device according to a fourth embodiment of the present invention, and the embodiment of the present invention is not limited to the specific implementation of the computing device.
As shown in fig. 5, the computing device may include: a processor 502, a communication interface (Communications Interface) 504, a memory 506, and a communication bus 508.
Wherein: processor 502, communication interface 504, and memory 506 communicate with each other via communication bus 508. A communication interface 504 for communicating with network elements of other devices, such as clients or other servers. The processor 502 is configured to execute the program 510, and may specifically perform the relevant steps in the embodiment of the product classification method described above.
In particular, program 510 may include program code including computer-operating instructions.
The processor 502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the computing device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
A memory 506 for storing a program 510. Memory 506 may comprise high-speed RAM memory or may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
Program 510 may be specifically configured to cause processor 502 to perform steps 110-130 of fig. 1, steps 210-250 of fig. 2, and to implement the functions of modules 410-450 of fig. 4.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The required structure for a construction of such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It will be appreciated that the teachings of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.

Claims (13)

1. A method of classifying a product, the method comprising:
acquiring spectrum data of a product to be classified;
dividing the spectrum data of the product to be classified evenly into a plurality of groups of spectral feature data according to the arrangement order of the spectrum data, wherein each group of spectral feature data contains the same number of spectrum data and one group of spectral feature data corresponds to one time step, the spectrum data being ordered by wavelength from small to large and the time steps increasing one by one according to the arrangement order of the spectrum data contained in the spectral feature data; and
inputting the plurality of groups of spectral feature data into a classification model to determine dependency relationships among the plurality of groups of spectral feature data based on the time steps, and obtaining the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with a plurality of sets of training data.
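The even split described in claim 1 can be sketched as a simple reshape: a spectrum ordered by wavelength from small to large is cut into equal-sized groups, and each group becomes one time step for the recurrent model. The sizes below (200 spectral points, 20 time steps) are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

# Placeholder spectrum: 200 intensity readings, already ordered by wavelength.
spectrum = np.linspace(0.0, 1.0, 200)

n_steps = 20                                  # number of time steps (assumed)
features_per_step = len(spectrum) // n_steps  # spectral points per group

# Row t of the reshaped array is the t-th group of spectral feature data,
# i.e. the input at time step t; the wavelength order is preserved.
feature_groups = spectrum.reshape(n_steps, features_per_step)

print(feature_groups.shape)  # (20, 10)
```

Because the reshape is row-major, concatenating the rows back together recovers the original wavelength ordering, which is what lets the time steps "increase one by one" along the spectrum.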
2. The method of claim 1, wherein prior to acquiring the spectral data of the product to be classified, the method further comprises:
acquiring a plurality of sets of training data, each set comprising a plurality of groups of spectral feature data of a sample product and identification information for identifying the type of the sample product; and
training a bidirectional gated recurrent unit (Bi-GRU) model according to the plurality of sets of training data to obtain the classification model.
3. The method of claim 2, wherein obtaining multiple sets of training data comprises:
acquiring spectrum data of all sample products;
evenly dividing the spectrum data of each sample product into a plurality of groups of spectral feature data, wherein each group contains the same number of spectrum data, so as to obtain the plurality of groups of spectral feature data of each sample product;
labeling each sample product to obtain the identification information of each sample product, wherein sample products of the same type have the same identification information and sample products of different types have different identification information; and
taking the plurality of groups of spectral feature data and the identification information of one sample product as one set of training data, so as to obtain the plurality of sets of training data.
4. The method of claim 2, wherein the Bi-GRU model comprises two Bi-GRU layers for learning the dependency relationships among the plurality of groups of spectral feature data of each sample product, and one fully connected layer for outputting the classification result.
5. The method of claim 2, wherein training the Bi-GRU model according to the plurality of sets of training data to obtain the classification model comprises:
inputting the plurality of groups of spectral feature data of each sample product into a first Bi-GRU layer;
learning, by the first Bi-GRU layer, dependency relationships among the plurality of groups of spectral feature data of each sample product to output a first external state of each sample product;
further learning, by a second Bi-GRU layer according to the first external state, the dependency relationships among the plurality of groups of spectral feature data of each sample product to output a second external state of each sample product;
weighting the second external state by the fully connected layer to obtain a weighted result for each sample product;
classifying the weighted results by a normalized exponential function (softmax) classifier to obtain an output result corresponding to each sample product;
calculating a loss function value according to the output results and the identification information;
updating the weights of the Bi-GRU model according to the loss function value until the loss function value reaches a minimum; and
taking the Bi-GRU model with the smallest loss function value as the classification model.
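The tail of claim 5 — fully connected weighting, softmax classification, and a loss computed against the identification information — can be sketched as follows. The dimensions and random weights are illustrative assumptions, and cross-entropy is used as a stand-in loss because the claim does not fix a particular loss function.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - x.max())
    return e / e.sum()

# Illustrative dimensions: the second external state concatenates forward and
# reverse halves (2*H values); C is the number of product classes (assumed).
H, C = 16, 4
state = rng.normal(size=2 * H)            # hypothetical second external state
W_fc = rng.normal(size=(C, 2 * H)) * 0.1  # fully connected layer weights
b_fc = np.zeros(C)

logits = W_fc @ state + b_fc              # weighted result
probs = softmax(logits)                   # softmax classifier output
predicted_class = int(np.argmax(probs))   # index of the most probable type

# Cross-entropy against a one-hot label (the identification information)
# stands in for the loss function value used to update the weights.
label = 2                                 # assumed identification index
loss = -np.log(probs[label])
```

In training, `loss` would be averaged over the batch and backpropagated to update the Bi-GRU and fully connected weights until it reaches a minimum.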
6. The method of claim 5, wherein the first Bi-GRU layer comprises a forward GRU unit and a reverse GRU unit that are independent of each other; and
wherein learning, by the first Bi-GRU layer, the dependency relationships among the plurality of groups of spectral feature data of each sample product to output the first external state of each sample product comprises:
learning, by the forward GRU unit, forward dependency relationships among the plurality of groups of spectral feature data of each sample product to output a forward external state of each sample product;
learning, by the reverse GRU unit, reverse dependency relationships among the plurality of groups of spectral feature data of each sample product to output a reverse external state of each sample product; and
combining the forward external state and the reverse external state to obtain the first external state of each sample product.
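The bidirectional structure of claim 6 — two independent units, one reading the feature groups in wavelength order and one in reversed order, with their final states combined — can be illustrated with a simplified stand-in cell. A plain tanh RNN cell is used here instead of a full GRU purely to keep the focus on the forward/reverse/concatenate structure; all sizes and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_direction(groups, W_in, W_rec):
    # Simplified recurrent cell (tanh RNN, not a full GRU): fold the groups
    # into a single final state in the order they are given.
    h = np.zeros(W_rec.shape[0])
    for x in groups:
        h = np.tanh(W_in @ x + W_rec @ h)
    return h

T, D, H = 20, 10, 8               # time steps, features per step, state size (assumed)
groups = rng.normal(size=(T, D))  # plurality of groups of spectral feature data

# Forward and reverse units have independent weights, as in claim 6.
W_in_f, W_rec_f = rng.normal(size=(H, D)), rng.normal(size=(H, H))
W_in_b, W_rec_b = rng.normal(size=(H, D)), rng.normal(size=(H, H))

forward_state = run_direction(groups, W_in_f, W_rec_f)        # wavelength order
reverse_state = run_direction(groups[::-1], W_in_b, W_rec_b)  # reversed order

# Combining here means concatenation, giving a 2*H-dimensional first state.
first_external_state = np.concatenate([forward_state, reverse_state])
```

Concatenation is one common way to combine the two directions; summation or averaging are alternatives, and the claim language does not fix the choice.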
7. The method of claim 6, wherein the forward GRU unit comprises a reset gate and an update gate; and
wherein learning, by the forward GRU unit, the forward dependency relationships among the plurality of groups of spectral feature data of each sample product to output the forward external state of each sample product comprises:
calculating gating signals of the reset gate and the update gate according to first spectral feature data and memory information of second spectral feature data recorded by the forward GRU unit, wherein the first spectral feature data and the second spectral feature data are spectral feature data of the same sample product, and the first spectral feature data is the group of spectral feature data immediately following the second spectral feature data;
determining a retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data;
determining memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate; and
outputting the memory information of the last group of spectral feature data of each sample product as the forward external state.
8. The method of claim 7, wherein calculating the gating signals of the reset gate and the update gate according to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit comprises:
calculating the gating signals of the reset gate and the update gate according to the following formulas:
r_t = σ(W_r x_t + U_r h_{t-1})
z_t = σ(W_z x_t + U_z h_{t-1})
wherein r_t and z_t represent the gating signal of the reset gate and the gating signal of the update gate, respectively, σ represents the sigmoid function, W_r, U_r and W_z, U_z represent the weight matrices of the reset gate and of the update gate, respectively, h_{t-1} represents the memory information of the second spectral feature data, and x_t represents the first spectral feature data.
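A minimal numerical sketch of the two gate formulas of claim 8, with illustrative dimensions and random weights (none of these values come from the patent):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

D, H = 10, 8                  # feature-group and state sizes (assumed)
x_t = rng.normal(size=D)      # first spectral feature data
h_prev = rng.normal(size=H)   # memory of the second (previous) group, h_{t-1}

W_r, U_r = rng.normal(size=(H, D)), rng.normal(size=(H, H))  # reset-gate weights
W_z, U_z = rng.normal(size=(H, D)), rng.normal(size=(H, H))  # update-gate weights

r_t = sigmoid(W_r @ x_t + U_r @ h_prev)  # reset-gate signal
z_t = sigmoid(W_z @ x_t + U_z @ h_prev)  # update-gate signal
```

Both gating signals lie strictly between 0 and 1, which is what lets them act as soft switches over the retained memory.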
9. The method of claim 8, wherein determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data comprises:
determining the retention amount of the memory information of the second spectral feature data according to the following formula:
h_t' = tanh(W x_t + U (r_t ⊙ h_{t-1}))
wherein h_t' represents the retention amount of the memory information of the second spectral feature data, h_{t-1} represents the memory information of the second spectral feature data, r_t represents the gating signal of the reset gate, ⊙ represents element-wise multiplication, and W and U represent the weight matrices of the forward GRU unit.
10. The method of claim 9, wherein determining the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate comprises:
determining the memory information of the first spectral feature data according to the following formula:
h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h_t'
wherein h_t represents the memory information of the first spectral feature data, z_t represents the gating signal of the update gate, h_{t-1} represents the memory information of the second spectral feature data, and h_t' represents the retention amount of the memory information of the second spectral feature data.
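Claims 8–10 together describe one step of a forward GRU: compute the two gates, form the retained candidate state, then blend old and new memory. The sketch below chains those steps over a sequence of feature groups, using the standard GRU formulation; the dimensions, weights, and the convention that z_t weights the old memory are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, params):
    """One forward-GRU step over one group of spectral feature data."""
    W_r, U_r, W_z, U_z, W, U = params
    r_t = sigmoid(W_r @ x_t + U_r @ h_prev)          # reset gate
    z_t = sigmoid(W_z @ x_t + U_z @ h_prev)          # update gate
    h_cand = np.tanh(W @ x_t + U @ (r_t * h_prev))   # retention amount
    return z_t * h_prev + (1.0 - z_t) * h_cand       # new memory information

D, H = 10, 8  # feature-group and state sizes (assumed)
# Input-side matrices are (H, D), recurrent matrices are (H, H).
params = tuple(rng.normal(size=(H, D)) if i % 2 == 0 else rng.normal(size=(H, H))
               for i in range(6))

h = np.zeros(H)                       # initial memory
for x in rng.normal(size=(20, D)):    # 20 groups of spectral feature data
    h = gru_step(x, h, params)        # final h = forward external state
```

Because each new state is a convex combination of the previous state and a tanh-bounded candidate, the memory stays bounded in (−1, 1) no matter how long the spectrum is.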
11. A product classification device, the device comprising:
an acquisition module, configured to acquire spectrum data of a product to be classified;
a splitting module, configured to evenly divide the spectrum data of the product to be classified into a plurality of groups of spectral feature data according to the arrangement order of the spectrum data, wherein each group of spectral feature data contains the same number of spectrum data and one group of spectral feature data corresponds to one time step, the spectrum data being ordered by wavelength from small to large and the time steps increasing one by one according to the arrangement order of the spectrum data contained in the spectral feature data; and
an input module, configured to input the plurality of groups of spectral feature data into a classification model to determine dependency relationships among the plurality of groups of spectral feature data based on the time steps, and to obtain the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with a plurality of sets of training data.
12. A computing device, comprising: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform operations corresponding to the product classification method according to any one of claims 1-10.
13. A computer storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the product classification method according to any one of claims 1-10.
CN201910804795.4A 2019-08-28 2019-08-28 Product classification method, device, computing equipment and computer storage medium Active CN110646350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910804795.4A CN110646350B (en) 2019-08-28 2019-08-28 Product classification method, device, computing equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN110646350A CN110646350A (en) 2020-01-03
CN110646350B true CN110646350B (en) 2023-06-02

Family

ID=68991029


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111579724B (en) * 2020-06-01 2022-07-12 中国标准化研究院 Rapid classification method and device for sensory sensitivity of tingling and peppery suprathreshold and application

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201156043Y (en) * 2008-02-26 2008-11-26 浙江大学 Non-destruction detector for synthetic quality of food
WO2016142692A1 (en) * 2015-03-06 2016-09-15 Micromass Uk Limited Spectrometric analysis
CN107169535B (en) * 2017-07-06 2023-11-03 谈宜勇 Deep learning classification method and device for biological multispectral image
CN108197751A (en) * 2018-01-23 2018-06-22 国网山东省电力公司电力科学研究院 Seq2seq network Short-Term Load Forecasting Methods based on multilayer Bi-GRU
CN108921285B (en) * 2018-06-22 2021-05-25 西安理工大学 Bidirectional gate control cyclic neural network-based classification method for power quality disturbance
CN109002771B (en) * 2018-06-26 2022-04-08 中国科学院遥感与数字地球研究所 Remote sensing image classification method based on recurrent neural network
CN109063773B (en) * 2018-08-03 2021-07-27 华中科技大学 Method for improving laser probe classification precision by using image features
CN109632693A (en) * 2018-12-10 2019-04-16 昆明理工大学 A kind of tera-hertz spectra recognition methods based on BLSTM-RNN



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200407

Address after: 1706, Fangda building, No. 011, Keji South 12th Road, high tech Zone, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen shuliantianxia Intelligent Technology Co.,Ltd.

Address before: 518000, building 10, building ten, building D, Shenzhen Institute of Aerospace Science and technology, 6 hi tech Southern District, Nanshan District, Shenzhen, Guangdong 1003, China

Applicant before: SHENZHEN H & T HOME ONLINE NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant