CN110646350A - Product classification method and device, computing equipment and computer storage medium - Google Patents


Info

Publication number
CN110646350A
CN110646350A (application number CN201910804795.4A)
Authority
CN
China
Prior art keywords
spectral, data, feature data, spectral feature, GRU
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910804795.4A
Other languages
Chinese (zh)
Other versions
CN110646350B (en)
Inventor
尹海波
金欢欢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Original Assignee
Shenzhen Heertai Home Furnishing Online Network Technology Co Ltd
Application filed by Shenzhen Heertai Home Furnishing Online Network Technology Co Ltd
Priority to CN201910804795.4A
Publication of CN110646350A
Application granted
Publication of CN110646350B
Legal status: Active

Classifications

    • G06F18/24 Pattern recognition — Analysing — Classification techniques
    • G01N21/25 Investigating or analysing materials by optical means — Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/3563 Using infrared light for analysing solids; Preparation of samples therefor
    • G01N33/025 Investigating or analysing food — Fruits or vegetables
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06N20/00 Machine learning
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Software Systems (AREA)
  • Food Science & Technology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Medicinal Chemistry (AREA)
  • Medical Informatics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of the invention relate to the field of machine learning and disclose a product classification method comprising the following steps: acquiring spectral data of a product to be classified; dividing the spectral data evenly into multiple groups of spectral feature data, where each group contains the same number of spectral data points and corresponds to one time step; and inputting the groups of spectral feature data into a classification model, which determines the dependency relationships among them based on the time steps and derives the type of the product from those dependencies. The classification model is obtained by training a recurrent neural network model on multiple sets of training data. In this way, the embodiments of the invention classify the products to be classified.

Description

Product classification method and device, computing equipment and computer storage medium
Technical Field
The embodiments of the present invention relate to the field of machine learning, and in particular to a product classification method, a product classification apparatus, a computing device, and a computer storage medium.
Background
As living standards rise, so do quality requirements for products such as fruits and vegetables. Quality detection and classification have therefore become important links in the standardized processing of many products.
Currently, methods for product classification rely primarily on manual work.
Disclosure of Invention
In view of the above, embodiments of the present invention provide a product classification method, apparatus, computing device and computer storage medium, which overcome or at least partially solve the above problems.
According to an aspect of an embodiment of the present invention, there is provided a product classification method, including:
acquiring spectral data of a product to be classified;
dividing the spectral data of the product to be classified evenly into multiple groups of spectral feature data, where each group contains the same number of spectral data points and each group corresponds to one time step;
and inputting the multiple groups of spectral feature data into a classification model to determine the dependency relationships among them based on the time steps, and obtaining the type of the product to be classified from those dependencies, where the classification model is obtained by training a recurrent neural network model on multiple sets of training data.
In an alternative form, prior to obtaining spectral data for a product to be classified, the method further comprises:
obtaining multiple sets of training data, each of which comprises multiple groups of spectral feature data of a sample product and identification information identifying the type of the sample product;
and training a bidirectional gated recurrent unit (Bi-GRU) model on the multiple sets of training data to obtain the classification model.
In an alternative approach, obtaining multiple sets of training data includes:
acquiring spectral data of all sample products;
dividing the spectral data of each sample product evenly into multiple groups of spectral feature data, each group containing the same number of spectral data points, to obtain the multiple groups of spectral feature data for each sample product;
identifying each sample product to obtain identification information of each sample product, wherein the identification information of the sample products of the same kind is the same, and the identification information of the sample products of different kinds is different;
and taking the multiple groups of spectral feature data and the identification information of one sample product as one set of training data, thereby obtaining the multiple sets of training data.
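The training-data preparation steps above can be sketched as follows. This is a minimal illustration using NumPy; the array shapes, helper name, and label scheme are assumptions for illustration, not part of the patent:

```python
import numpy as np

def make_training_data(spectra, labels, n_steps=12):
    """Pair each sample's evenly split spectrum with its identification
    label; samples of the same kind share the same label value."""
    spectra = np.asarray(spectra, dtype=float)
    n_samples, n_points = spectra.shape
    assert n_points % n_steps == 0, "spectrum must split evenly"
    # Reshaping preserves the wavelength ordering within each group.
    features = spectra.reshape(n_samples, n_steps, n_points // n_steps)
    return list(zip(features, labels))

# Hypothetical example: 3 sample products, 228 spectral points each,
# split into 12 time steps of 19 points.
training_data = make_training_data(
    np.arange(3 * 228).reshape(3, 228), labels=[0, 1, 0])
```

Each element of `training_data` is then one (spectral feature groups, identification) pair in the sense of the claim above.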
In an alternative approach, the Bi-GRU model contains two Bi-GRU layers, which learn the dependency relationships among the groups of spectral feature data of each sample product, and one fully connected layer, which outputs the classification result.
In an optional manner, training the Bi-GRU model according to the multiple sets of training data to obtain a classification model includes:
inputting a plurality of sets of spectral feature data of the sample product into a first Bi-GRU layer;
learning the dependency relationship among the multiple groups of spectral feature data of each sample product through the first Bi-GRU layer to output a first external state of each sample product;
continuously learning the dependency relationship among the multiple groups of spectral feature data of each sample product according to the first external state through a second Bi-GRU layer so as to output a second external state of each sample product;
weighting the second external state through a full connection layer to obtain a weighting result of each sample product;
passing the weighting result through a softmax (normalized exponential function) classifier to obtain the output result corresponding to each sample product;
calculating a loss function value according to the output result and the identification information;
updating the weight of the Bi-GRU model according to the loss function value until the loss function value is minimum;
and taking the Bi-GRU model with the minimum loss function value as the classification model.
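The final steps of the training procedure above (fully connected weighting, softmax, and loss computation) can be sketched in NumPy. The dimensions and variable names below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def classify_and_loss(second_state, W_fc, b_fc, label):
    """Weight the second external state with a fully connected layer,
    apply softmax, and compute cross-entropy against the label."""
    logits = second_state @ W_fc + b_fc          # weighting result
    probs = softmax(logits)                      # classification output
    loss = -np.log(probs[label])                 # cross-entropy loss
    return probs, loss

rng = np.random.default_rng(0)
state = rng.standard_normal(8)                   # second external state
W, b = rng.standard_normal((8, 3)), np.zeros(3)  # 3 product classes
probs, loss = classify_and_loss(state, W, b, label=2)
```

In training, the loss would be backpropagated to update the Bi-GRU weights until it reaches its minimum, as the steps above describe.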
In an alternative approach, the first Bi-GRU layer includes a forward GRU unit and a reverse GRU unit, the forward GRU unit and the reverse GRU unit being independent of each other;
learning, by the first Bi-GRU layer, a dependency relationship between the plurality of sets of spectral feature data of each sample product to output a first external state of each sample product, including:
learning a forward dependency relationship among a plurality of groups of spectral feature data of each sample product through the forward GRU unit, and outputting a forward external state of each sample product;
learning the reverse dependency relationship among the multiple groups of spectral feature data of each sample product through the reverse GRU unit, and outputting the reverse external state of each sample product;
and combining the forward external state and the reverse external state to obtain a first external state of each sample product.
In an alternative approach, the forward GRU unit includes a reset gate and an update gate, and the learning of the forward dependency relationship between the sets of spectral feature data of each sample product by the forward GRU unit and the outputting of the forward external state of each sample product includes:
according to first spectral characteristic data and memory information of second spectral characteristic data recorded by the forward GRU unit, gate control signals of the reset gate and the update gate are calculated, wherein the first spectral characteristic data and the second spectral characteristic data are spectral characteristic data of the same sample product, and the first spectral characteristic data is a next group of spectral characteristic data of the second spectral characteristic data;
determining the retention amount of the memory information of the second spectral characteristic data according to the gating signal of the reset gate and the first spectral characteristic data;
determining memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate;
and outputting the memory information of the last group of spectral characteristic data of each sample product as the forward external state.
In an alternative approach, calculating the gating signals of the reset gate and the update gate from the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit comprises calculating them according to the following formulas:
$$r_t = \sigma(W_r x_t + U_r h_{t-1})$$
$$z_t = \sigma(W_z x_t + U_z h_{t-1})$$
where $r_t$ and $z_t$ denote the gating signals of the reset gate and the update gate respectively, $\sigma$ denotes the sigmoid function, $W_r$, $U_r$ and $W_z$, $U_z$ denote the weight matrices of the reset gate and the update gate respectively, $h_{t-1}$ denotes the memory information of the second spectral feature data, and $x_t$ denotes the first spectral feature data.
In an alternative mode, determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data comprises:
determining the retention amount of the memory information of the second spectral feature data from the gating signal of the reset gate and the first spectral feature data according to the following formula:
$$h_t' = \tanh\left(W x_t + U \left(r_t \odot h_{t-1}\right)\right)$$
where $h_t'$ denotes the retention amount of the memory information of the second spectral feature data, $h_{t-1}$ denotes the memory information of the second spectral feature data, $r_t$ denotes the gating signal of the reset gate, $x_t$ denotes the first spectral feature data, $\odot$ denotes element-wise multiplication, and $W$ and $U$ denote the weight matrices of the forward GRU unit.
In an alternative mode, determining the memory information of the first spectral feature data according to the reserved quantity and the gating signal of the update gate includes:
determining the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate and the following formula:
$$h_t = (1 - z_t) \odot h_{t-1} + z_t \odot h_t'$$
where $h_t$ denotes the memory information of the first spectral feature data, $z_t$ denotes the gating signal of the update gate, $h_{t-1}$ denotes the memory information of the second spectral feature data, and $h_t'$ denotes the retention amount of the memory information of the second spectral feature data.
According to another aspect of the embodiments of the present invention, there is provided a product sorting apparatus including:
the acquisition module is used for acquiring the spectral data of the products to be classified;
the segmentation module is used for dividing the spectral data of the product to be classified evenly into multiple groups of spectral feature data, where each group contains the same number of spectral data points and corresponds to one time step;
and the input module is used for inputting the multiple groups of spectral feature data into a classification model, which determines the dependency relationships among them based on the time steps and obtains the type of the product to be classified from those dependencies; the classification model is obtained by training a recurrent neural network model on multiple sets of training data.
According to another aspect of embodiments of the present invention, there is provided a computing device including: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the corresponding operation of the product classification method.
According to another aspect of the embodiments of the present invention, there is provided a computer storage medium having at least one executable instruction stored therein, where the executable instruction causes the processor to perform an operation corresponding to the product classification method.
In the embodiments of the invention, the acquired spectral data of a product to be classified are divided to obtain multiple groups of spectral feature data, which are then input into the classification model to obtain the type of the product. Because the classification model is obtained by training a recurrent neural network model on multiple sets of training data, it encodes both the dependency relationships among groups of spectral feature data and the product classes that correspond to groups with different dependency relationships, so products can be classified accurately by the model.
The foregoing is only an overview of the technical solutions of the embodiments of the invention. To make the technical means of the embodiments clearer, and to make the above and other objects, features, and advantages more readily understandable, the detailed description of the invention is provided below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart of a product classification method according to a first embodiment of the present invention;
FIG. 2 is a flow chart illustrating a method for classifying products according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a Bi-GRU layer in a product classification method according to a second embodiment of the present invention;
fig. 3a is a schematic structural diagram illustrating a forward GRU unit in a product classification method according to a second embodiment of the present invention;
fig. 4 shows a functional block diagram of a product sorting apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computing device according to a fourth embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The embodiments of the invention are applicable to scenarios in which products are classified. Specifically, they can be applied to distinguishing different kinds of products, such as classifying different fruits to identify their kinds (e.g., apple, banana); they can also be applied to further classifying products of the same kind, for example identifying the specific variety of an apple (e.g., green apple, red apple). The application scenarios are not limited to those listed above.
The general principle of the embodiments of the invention is as follows. When products are irradiated by a laser, different products produce different diffusely reflected spectral data, so the spectral data can be used for feature recognition and discrimination. A recurrent neural network model learns the spectral feature data corresponding to each product and determines the dependency relationships among the spectral feature data of different products, yielding a classification model that classifies products from their spectral feature data; classifying products with the trained model improves classification efficiency. Moreover, the trained model captures both the dependency relationships among the multiple groups of spectral feature data of a product and the classification results corresponding to groups with different dependency relationships. Compared with classification based on a single feature or on several mutually independent features, this is more reliable and therefore improves the accuracy of product classification.
The technical solution of the embodiment of the present invention is specifically described below.
Fig. 1 shows a flow chart of a product classification method according to a first embodiment of the invention. As shown in fig. 1, the method comprises the steps of:
step 110: spectral data of the products to be classified are obtained.
The products to be classified in this step may be different varieties within the same category of product, for example apples, bananas, and oranges among fruits; different categories of products, for example fruits and vegetables; or varieties drawn from different categories, for example apples and eggplants. An electromagnetic wave transmitter covering a certain wavelength range emits electromagnetic waves toward the product to be classified. When the waves reach the product, those at some wavelengths are absorbed by it; a spectrum acquisition device determines the spectrum of the absorbed electromagnetic waves, yielding the spectral data of the product. Note that different products absorb different electromagnetic wave spectra, so spectral data derived from the absorbed waves can effectively distinguish different products. The electromagnetic waves used in the embodiments have wavelengths between microwave and visible light, for example infrared light. When the product to be classified is irradiated with infrared light, its spectral data are collected by an infrared spectrum detector. The number of collected spectral data points depends on the acquisition frequency of the device: for example, if electromagnetic waves in the 900-1700 nm range are emitted toward the product, receiving all of them takes 228 milliseconds, and the device samples once per millisecond, then 228 spectral data points are collected.
Step 120: the spectral data of the product to be classified are averagely divided into a plurality of groups of spectral feature data, the number of the spectral data contained in each group of spectral feature data is the same, and one group of spectral feature data corresponds to one time step.
In this step, the purpose of dividing the spectral data is to generate time steps, so that a recurrent neural network can learn the dependencies between them. The time step is the basis for determining the dependency relationships among the groups of spectral feature data and is equivalent to a time dimension of the spectral feature data: after the spectral data are divided into multiple groups, the spectral feature data gain one time dimension relative to the raw spectral data. The spectral data are divided evenly, each group contains the same number of data points, and each group corresponds to one time step. The embodiments do not limit the number of parts into which the spectral data are divided; each part is one group of spectral feature data and corresponds to one time step. The number of parts may correspond to the number of neurons in the classification model, i.e., the number of parts equals the number of neurons in the classification model.
For example, suppose the spectral data form a matrix with 10 rows and 228 columns, i.e., ten products to be classified, each with 228 spectral data points. Dividing the 228 data points of each product evenly into 12 parts yields 12 groups of spectral feature data, each containing 19 data points; 12 time steps are generated in the order of the division, and the groups of spectral feature data have dependency relationships across the time steps. In a concrete implementation, the collected spectral data are arranged in ascending order of wavelength, and the data are divided in that order without changing the arrangement.
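The splitting in this example amounts to a simple reshape. A minimal NumPy sketch (the 10x228 matrix below is synthetic stand-in data):

```python
import numpy as np

# Hypothetical batch: 10 products to classify, 228 spectral points each,
# already arranged in ascending order of wavelength.
spectra = np.arange(10 * 228, dtype=float).reshape(10, 228)

# Split each row into 12 equal groups of 19 points. Reshaping preserves
# the original ordering, so each group keeps its wavelength sequence
# and each of the 12 groups corresponds to one time step.
n_steps = 12
features = spectra.reshape(10, n_steps, 228 // n_steps)
```

The resulting axis of length 12 is the time dimension the recurrent network consumes.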
Step 130: and inputting the multiple groups of spectral characteristic data into a classification model to determine the dependency relationship among the multiple groups of spectral characteristic data based on the time step, and obtaining the types of the products to be classified based on the dependency relationship.
In this step, the classification model is obtained by training a recurrent neural network model on multiple sets of training data. The recurrent neural network learns the dependency relationships among the groups of spectral feature data; these relationships differ across products, and there is a one-to-one correspondence between dependency relationships and product types, so once the dependency relationships of a product are determined, its type is uniquely determined. The number of neurons that receive spectral feature data in the recurrent neural network matches the number of groups of spectral feature data, which is determined by the number of parts into which the data are divided in step 120. For example, dividing the spectral data into 12 parts gives 12 groups of spectral feature data and hence 12 receiving neurons, each of which receives one group. In a concrete implementation, the recurrent neural network may be any network able to learn dependencies between features, such as a long short-term memory network (LSTM), a bidirectional LSTM, a gated recurrent unit network (GRU), or a bidirectional gated recurrent unit network (Bi-GRU).
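A bidirectional layer of the kind named above runs two independent passes over the time steps, one forward and one in reverse, and combines the two final states. The NumPy sketch below uses a toy tanh cell as a stand-in for a GRU unit, purely to illustrate the bidirectional mechanics; all weights and dimensions are assumptions:

```python
import numpy as np

def run_direction(steps, cell, h0):
    """Run a recurrent cell over the sequence of spectral feature
    groups and return the final (external) state."""
    h = h0
    for x in steps:
        h = cell(x, h)
    return h

def bidirectional_state(steps, cell_fw, cell_bw, d_h):
    """Independent forward and reverse passes; the combined external
    state is the concatenation of the two final states."""
    h_fw = run_direction(steps, cell_fw, np.zeros(d_h))
    h_bw = run_direction(steps[::-1], cell_bw, np.zeros(d_h))
    return np.concatenate([h_fw, h_bw])

# Toy cell standing in for a GRU unit (hypothetical weights).
rng = np.random.default_rng(2)
Wx, Wh = 0.1 * rng.standard_normal((8, 19)), 0.1 * rng.standard_normal((8, 8))
cell = lambda x, h: np.tanh(Wx @ x + Wh @ h)

steps = [rng.standard_normal(19) for _ in range(12)]   # 12 time steps
state = bidirectional_state(steps, cell, cell, d_h=8)
```

In the Bi-GRU model described earlier, this combined state is what the second layer and then the fully connected layer consume.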
In the embodiments of the invention, the acquired spectral data of a product to be classified are divided to obtain multiple groups of spectral feature data, which are input into the classification model to obtain the product type. Because the classification model is trained from a recurrent neural network model on multiple sets of training data, it encodes the dependency relationships among groups of spectral feature data and the classification results corresponding to groups with different dependency relationships, so the products to be classified can be classified accurately. In addition, unlike common machine learning algorithms, which can only classify products from an unordered combination of features, the recurrent neural network used in the embodiments bases classification on the dependency relationships among the groups of spectral feature data; the groups are mutually related, which is more reliable than a single feature or several mutually independent features. Moreover, when a common machine learning algorithm trains a classification model, the training effect is limited by the number of training samples and the feature dimension of the training data, and a good training effect is achieved only when the number of samples is close to the feature dimension, a restriction the recurrent model used here avoids.
Fig. 2 shows a flow chart of a product classification method according to a second embodiment of the invention, which method, as shown in fig. 2, comprises the following steps:
step 210: multiple sets of training data are acquired.
In this step, the multiple sets of training data are obtained from the spectral data of all sample products, where the types of the sample products coincide with the types of the products to be classified; in a specific implementation, several products of each type are selected as sample products. Each of the multiple sets of training data includes multiple sets of spectral feature data of a sample product and identification information identifying the type of that sample product. The spectral data of each sample product are divided evenly into multiple sets of spectral feature data, each set containing the same number of spectral data, to obtain the multiple sets of spectral feature data of each sample product. The spectral data are arranged in order of increasing wavelength; during division, the spectral data are split according to this order, each resulting set of spectral feature data corresponds to one time step, and the time steps increase one by one following the arrangement order of the spectral data they contain. For example, 228 spectral data arranged in order of increasing wavelength are divided, in that order, into 12 sets of spectral feature data of 19 spectral data each; the first set corresponds to time step 1, the second set to time step 2, and so on, until the last set corresponds to time step 12.
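As a concrete illustration, the division described above can be sketched in a few lines of NumPy (the 228-point spectrum and the 12 × 19 split follow the example in the text; the random data stand in for real spectral measurements):

```python
import numpy as np

# Split one spectrum (228 intensity values, already sorted by ascending
# wavelength) into 12 equal sets of 19 values, one set per time step.
NUM_POINTS, NUM_STEPS = 228, 12
GROUP_SIZE = NUM_POINTS // NUM_STEPS  # 19 spectral data per set

spectrum = np.random.rand(NUM_POINTS)            # placeholder spectral data
feature_sets = spectrum.reshape(NUM_STEPS, GROUP_SIZE)

# Row 0 corresponds to time step 1, row 1 to time step 2, ..., row 11 to step 12.
print(feature_sets.shape)  # (12, 19)
```

Because `reshape` preserves row-major order, each row keeps the wavelength ordering of the original spectrum, which is what makes the time-step interpretation valid.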
The category to which each sample product belongs is known, and each sample product is labeled with identification information: sample products of the same category share the same identification information, and sample products of different categories have different identification information. The representation of the identification information can be chosen by those skilled in the art when implementing the embodiments of the present invention, for example an English-letter label or a numerical label for each sample product. In one embodiment of the present invention, the different types of sample products are identified by binary numbers; for example, if there are five types of products to be classified, the identification information of the five types of sample products is 00001, 00010, 00100, 01000 and 10000, respectively. The multiple sets of spectral feature data of each sample product, together with the identification information of that sample product, form one set of training data, and the training data of all sample products form the multiple sets of training data.
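The binary identification scheme above is equivalent to one-hot labeling; a minimal sketch (the class count of five follows the example in the text):

```python
import numpy as np

def one_hot_label(class_index, num_classes=5):
    """Return the identification vector for a sample product's class;
    the codes 00001 ... 10000 in the text are one-hot vectors."""
    label = np.zeros(num_classes, dtype=int)
    label[class_index] = 1
    return label

# Class 0 of 5 maps to [1, 0, 0, 0, 0], i.e. the code "10000".
print(one_hot_label(0))
```

One-hot labels pair naturally with the softmax output and cross-entropy loss used later in this embodiment, since exactly one component of each label is 1.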
Step 220: and training the Bi-GRU model of the bidirectional threshold circulation unit according to a plurality of groups of training data to obtain a classification model.
In this step, the Bi-GRU model includes at least one Bi-GRU layer for learning the dependency relationships between the sets of spectral feature data of each sample product and a fully connected layer for outputting the classification result. In some embodiments, the Bi-GRU model comprises two Bi-GRU layers: the sets of spectral feature data of a number of sample products selected from all sample products are input into the first Bi-GRU layer, which learns the dependency relationships between the sets of spectral feature data of each sample product and outputs a first external state for each sample product. The number of sample products selected each time can be any value within the hardware tolerance and is chosen by those skilled in the art when implementing the embodiments of the present invention; for example, if the size of the video memory limits the number of sample products that can be processed at a time to 50, then the sets of spectral feature data of any number of sample products between 1 and 50 can be selected each time and input into the first Bi-GRU layer. The number of sample products selected and processed each time is less than or equal to the total number of sample products, and one epoch is completed through multiple such iterations.
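The batch selection just described can be sketched as follows (the total of 200 sample products and the limit of 50 per batch are illustrative assumptions; the array shapes follow the 12 × 19 running example):

```python
import numpy as np

NUM_SAMPLES, NUM_STEPS, GROUP_SIZE = 200, 12, 19
MAX_BATCH = 50  # assumed video-memory limit on sample products per batch

# Spectral feature data of all sample products: (samples, time steps, features).
all_features = np.random.rand(NUM_SAMPLES, NUM_STEPS, GROUP_SIZE)

batches = [all_features[i:i + MAX_BATCH]           # at most 50 products each
           for i in range(0, NUM_SAMPLES, MAX_BATCH)]
# Processing every batch once completes one epoch.
print(len(batches))  # 4
```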
The first external state output by the first Bi-GRU layer serves as the input of the second Bi-GRU layer, which continues to learn the dependency relationships among the sets of spectral feature data of each sample product from the first external state and outputs a second external state. Both the first and the second external state are multidimensional arrays representing the sets of spectral feature data, whose elements characterize the dependency relationships among those sets. The dependency relationships characterized by the second external state are learned on the basis of the first external state, so the second external state is more representative of the dependency relationships among the sets of spectral feature data.
The first Bi-GRU layer and the second Bi-GRU layer have the same structure and the same learning process for their input data. Taking one Bi-GRU layer as an example, fig. 3 shows a structural schematic diagram of the Bi-GRU layer. As shown in fig. 3, the Bi-GRU layer includes a forward GRU unit and a reverse GRU unit, which are independent of each other, have the same structure and both have a memory function; they differ only in that they learn dependency relationships in opposite directions. The forward GRU unit learns the forward dependency relationships among the sets of spectral feature data of each sample product and outputs a forward external state for each sample product. The dependency relationships learned in ascending order of time step are the forward dependency relationships: for example, the dependency of the spectral feature data at time step 2 on the spectral feature data at time step 1 is learned, then the dependency of the spectral feature data at time step 3 on the spectral feature data at time step 2, and so on, until the dependency of the last set of spectral feature data on the second-to-last set; the dependency relationships between every two adjacent sets of spectral feature data are output as the forward external state.
The reverse GRU unit learns the reverse dependency relationships among the sets of spectral feature data of each sample product and outputs a reverse external state for each sample product. The dependency relationships learned in descending order of time step are the reverse dependency relationships: for example, the dependency of the spectral feature data at time step 11 on the spectral feature data at time step 12 is learned, then the dependency of the spectral feature data at time step 10 on the spectral feature data at time step 11, and so on, until the dependency of the spectral feature data at time step 1 on the spectral feature data at time step 2; the dependency relationships between every two adjacent sets of spectral feature data are output as the reverse external state.
The forward external state output by the forward GRU unit and the reverse external state output by the reverse GRU unit are combined to obtain the first external state of each sample product. The forward and reverse external states are produced by the same process, only in opposite directions; the output process of the forward external state is therefore described below as an example.
Referring to fig. 3a, which shows a schematic structural diagram of the forward GRU unit, the forward GRU unit includes a reset gate and an update gate, and records memory information of second spectral feature data; this memory information can be regarded as dependency information between the second spectral feature data and the preceding set of spectral feature data. The second spectral feature data and the input first spectral feature data belong to the same sample product, and the first spectral feature data is the set of spectral feature data immediately following the second spectral feature data, i.e., the time step of the first spectral feature data lags that of the second spectral feature data by one time step. For example, if the time step corresponding to the first spectral feature data is 10, i.e., that set was the tenth group in the division, then the second spectral feature data is the set of spectral feature data at time step 9. From the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit, the gating signals of the reset gate and the update gate are calculated according to the following formulas:
r_t = σ(W_r x_t + U_r h_{t-1})
z_t = σ(W_z x_t + U_z h_{t-1})
where r_t and z_t denote the gating signal of the reset gate and the gating signal of the update gate respectively, σ denotes the sigmoid function, W_r, U_r and W_z, U_z denote the weight matrices of the reset gate and of the update gate respectively, h_{t-1} denotes the memory information of the second spectral feature data, and x_t denotes the first spectral feature data.
According to the gating signal of the reset gate and the first spectral feature data, the retention amount of the memory information of the second spectral feature data is determined according to the following formula:
h_t' = tanh(W x_t + U (r_t ⊙ h_{t-1}))
where h_t' denotes the retention amount of the memory information of the second spectral feature data, h_{t-1} denotes the memory information of the second spectral feature data, ⊙ denotes element-wise multiplication, and W and U denote the weight matrices of the forward GRU unit.
According to the retention amount h_t' and the gating signal z_t of the update gate, the memory information of the first spectral feature data is determined according to the following formula:
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h_t'
where h_t denotes the memory information of the first spectral feature data, z_t denotes the gating signal of the update gate, h_{t-1} denotes the memory information of the second spectral feature data, and h_t' denotes the retention amount of the memory information of the second spectral feature data.
The forward GRU unit cyclically updates its memory information in this manner until the memory information of the last set of spectral feature data is obtained, and outputs that memory information as the forward external state. The reverse external state output by the reverse GRU unit is, correspondingly, the memory information of the first set of spectral feature data in the time-step dimension. The forward external state and the reverse external state are combined to obtain the first external state of the first Bi-GRU layer.
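The gate equations above can be exercised with a minimal NumPy sketch of the forward and reverse passes (the hidden size, random weights, and weight sharing between the two directions are simplifying assumptions; a real Bi-GRU layer learns separate, trained weights for each direction):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, Wr, Ur, Wz, Uz, W, U):
    """One GRU update following the reset-gate / update-gate formulas above."""
    r_t = sigmoid(Wr @ x_t + Ur @ h_prev)            # reset-gate signal
    z_t = sigmoid(Wz @ x_t + Uz @ h_prev)            # update-gate signal
    h_cand = np.tanh(W @ x_t + U @ (r_t * h_prev))   # retention amount h_t'
    return (1.0 - z_t) * h_prev + z_t * h_cand       # new memory information h_t

rng = np.random.default_rng(0)
D_IN, D_H, NUM_STEPS = 19, 8, 12                     # 8 hidden units is arbitrary
Wr, Wz, W = (0.1 * rng.standard_normal((D_H, D_IN)) for _ in range(3))
Ur, Uz, U = (0.1 * rng.standard_normal((D_H, D_H)) for _ in range(3))

sequence = rng.standard_normal((NUM_STEPS, D_IN))    # 12 sets of spectral features

h = np.zeros(D_H)
for x_t in sequence:                  # ascending time steps: forward dependency
    h = gru_step(x_t, h, Wr, Ur, Wz, Uz, W, U)
forward_state = h                     # memory of the last set of features

h = np.zeros(D_H)
for x_t in sequence[::-1]:            # descending time steps: reverse dependency
    h = gru_step(x_t, h, Wr, Ur, Wz, Uz, W, U)
reverse_state = h                     # memory of the first set of features

bi_state = np.concatenate([forward_state, reverse_state])  # combined external state
print(bi_state.shape)  # (16,)
```

Concatenating the two final memories is one common way to combine the directions; the combined state is what the text calls the first external state.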
The second Bi-GRU layer continues to learn the dependency relationships between the sets of spectral feature data of each sample product from the first external state and outputs the second external state. The output process of the second external state is similar to that of the first external state; only the input data changes from the sets of spectral feature data to the first external state. The output process of the second Bi-GRU layer is therefore as described with reference to fig. 3a and is not repeated here.
The second external state is weighted by the fully connected layer, in which the number of neurons equals the number of types contained among all sample products; each component of the second external state corresponds to a preset weight, and after weighting, a weighted result of each sample product for each type is obtained. The weighted results are passed through a softmax classifier to produce a maximized classification output for each sample product. The softmax classifier uses the softmax function, a normalized exponential function that maps the weighted results into the interval from 0 to 1; the mapped values can be regarded as the probability of each class, and the class corresponding to the maximum value is the predicted class of the sample product. The output result is calculated as follows:
s_θ(o_i)_j = exp(θ_j^T o_i) / Σ_{l=1}^{k} exp(θ_l^T o_i), j = 1, …, k
where o_i denotes the second external state output by the second Bi-GRU layer for the i-th sample product, s_θ(o_i) denotes the output of the softmax classifier for the i-th sample product, θ denotes the weight matrix of the fully connected layer (θ_j being its j-th row), and k denotes the number of classes of sample products; for example, if there are 5 classes of sample products to be distinguished, k = 5.
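A minimal sketch of the softmax classification output (the row-per-class layout of θ and the dimensions are assumptions matching the running example):

```python
import numpy as np

def softmax_output(o_i, theta):
    """Weight a second external state and normalize it into class
    probabilities; theta has one row per class (an assumed layout)."""
    logits = theta @ o_i
    exp = np.exp(logits - logits.max())  # shift by max for numerical stability
    return exp / exp.sum()

rng = np.random.default_rng(1)
o_i = rng.standard_normal(16)            # illustrative second external state
theta = rng.standard_normal((5, 16))     # 5 classes, as in the k = 5 example
probs = softmax_output(o_i, theta)
predicted_class = int(np.argmax(probs))  # class with the maximum probability
print(round(float(probs.sum()), 6))  # 1.0
```

Subtracting the maximum logit before exponentiating leaves the probabilities unchanged but avoids overflow for large weighted results.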
A loss function value is calculated from the output result of each sample product and the identification information of that sample product. The specific expression of the loss function can be chosen by those skilled in the art; the embodiments of the present invention do not limit the type of loss function. In a specific embodiment, the loss function is a cross-entropy loss function containing an L2 regularization term, with the following calculation formula:
J = -(1/m) Σ_{i=1}^{m} log s_θ(o_i)_{y_i} + λ Σ_{l=1}^{q} Σ_{j=1}^{k} θ_{lj}²
where J denotes the loss function value, m denotes the number of input samples, y_i denotes the class indicated by the identification information of the i-th sample product, q denotes the number of neurons of the second Bi-GRU layer, i.e., the dimension of the second external state, θ_{lj} denotes the weight between the l-th neuron of the second Bi-GRU layer and the j-th neuron of the fully connected layer, and λ denotes the weight factor.
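A regularized cross-entropy of this kind can be sketched as follows (the value of the weight factor lam and the array shapes are illustrative assumptions):

```python
import numpy as np

def cross_entropy_l2(probs, labels_one_hot, theta, lam=1e-3):
    """Mean cross-entropy over m samples plus an L2 penalty on the
    fully connected weights theta (lam is the weight factor; its
    value here is an arbitrary assumption)."""
    m = probs.shape[0]
    data_term = -np.sum(labels_one_hot * np.log(probs + 1e-12)) / m
    reg_term = lam * np.sum(theta ** 2)
    return data_term + reg_term

# Perfectly confident, correct predictions give (near) zero data loss,
# so with zero weights the total loss is (near) zero.
probs = np.eye(5)[[0, 2, 4]]          # predicted distributions for 3 samples
labels = np.eye(5)[[0, 2, 4]]         # matching one-hot identification info
loss = cross_entropy_l2(probs, labels, theta=np.zeros((16, 5)))
print(abs(float(loss)) < 1e-6)  # True
```

The small constant added inside the logarithm guards against log(0) when a predicted probability is exactly zero.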
The weights of the Bi-GRU model are updated according to the loss function value until the loss function value is minimized, and the Bi-GRU model with the minimum loss function value is taken as the classification model. When updating the weights, the weights θ_{lj} are corrected by back propagation until the loss function value is minimal. The weights that minimize the loss function, together with the corresponding Bi-GRU model structure, constitute the classification model.
Step 230: spectral data of the products to be classified are obtained.
In this step, the number of the spectrum data of the product to be classified is consistent with the number of the spectrum data of each sample product.
Step 240: the spectral data of the product to be classified are averagely divided into a plurality of groups of spectral feature data, the number of the spectral data contained in each group of spectral feature data is the same, and one group of spectral feature data corresponds to one time step.
In this step, the division of the spectral data of the product to be classified is identical to the division of the spectral data of the sample products used in training the classification model. For example, if the 228 spectral data of each sample product were divided into 12 time steps of 19 spectral data each, then the 228 spectral data of the product to be classified are likewise divided into 12 time steps of 19 spectral data each.
Step 250: and inputting the multiple groups of spectral feature data of the products to be classified into the classification model, determining the dependency relationship among the multiple groups of spectral feature data based on the time step, and obtaining the types of the products to be classified based on the dependency relationship.
According to the embodiment of the invention, the classification model is obtained by training the Bi-GRU model with multiple sets of training data. Since the Bi-GRU model learns both the forward and the reverse dependency relationships of the sets of spectral feature data, the classification model trained from it contains the dependency relationships in both directions between the sets of spectral feature data of each type of sample product, so that products to be classified can be classified reliably.
Fig. 4 shows a functional block diagram of a product sorting apparatus according to a third embodiment of the present invention. As shown in fig. 4, the apparatus includes: the system comprises an acquisition module 410, a segmentation module 420 and an input module 430, wherein the acquisition module 410 is used for acquiring spectral data of a product to be classified. The segmentation module 420 is configured to averagely segment the spectral data of the product to be classified into a plurality of sets of spectral feature data, where the number of the spectral data included in each set of spectral feature data is the same, and a set of spectral feature data corresponds to a time step. The input module 430 is configured to input the multiple sets of spectral feature data into a classification model, so as to determine a dependency relationship between the multiple sets of spectral feature data based on the time step, and obtain a type of a product to be classified based on the dependency relationship, where the classification model is obtained by training a time-cycle neural network model with multiple sets of training data.
In an optional manner, the apparatus further comprises: a first obtaining module 440, configured to obtain multiple sets of training data, where each of the multiple sets of training data includes: a plurality of sets of spectral feature data of the sample product and identification information for identifying a type of the sample product. And the training module 450 is configured to train the Bi-directional threshold cycle unit Bi-GRU model according to the plurality of sets of training data to obtain a classification model.
In an alternative manner, the first acquiring module 440 is further configured to acquire spectral data of all sample products; averagely dividing the spectral data of each sample product into a plurality of groups of spectral characteristic data, wherein the number of the spectral data contained in each group of spectral characteristic data is the same, so as to obtain a plurality of groups of spectral characteristic data of each sample product; identifying each sample product to obtain identification information of each sample product, wherein the identification information of the sample products of the same kind is the same, and the identification information of the sample products of different kinds is different; and taking the multiple groups of spectral feature data and the identification information of a sample product as a group of training data to obtain the multiple groups of training data.
In an alternative approach, the Bi-GRU model contains two Bi-GRU layers for learning the dependency relationships between the sets of spectral feature data of each sample product and one fully connected layer for outputting the classification results.
In an optional manner, the training module 450 is further configured to input the plurality of sets of spectral feature data of the sample product into the first Bi-GRU layer; learning the dependency relationship among the multiple groups of spectral feature data of each sample product through the first Bi-GRU layer to output a first external state of each sample product; continuously learning the dependency relationship among the multiple groups of spectral feature data of each sample product according to the first external state through a second Bi-GRU layer so as to output a second external state of each sample product; weighting the second external state through a full connection layer to obtain a weighting result of each sample product; performing classification maximization output on the weighting result through a normalization index function softmax classifier to obtain an output result corresponding to each sample product; calculating a loss function value according to the output result and the identification information; updating the weight of the Bi-GRU model according to the loss function value until the loss function value is minimum; and taking the Bi-GRU model with the minimum loss function value as the classification model.
In an alternative, the first Bi-GRU layer includes a forward GRU unit and a reverse GRU unit, the forward GRU unit and the reverse GRU unit being independent of each other, the training module 450 is further configured to: learning, by the forward GRU unit, a forward dependency relationship between the sets of spectral feature data of each sample product to output a forward external state of each sample product; learning, by the reverse GRU unit, an inverse dependency relationship between the sets of spectral feature data for each sample product to output a reverse external state for each sample product; and combining the forward external state and the reverse external state to obtain a first external state of each sample product.
In an alternative mode, the forward GRU unit includes a reset gate and an update gate, and the training module 450 is further configured to calculate the gating signals of the reset gate and the update gate according to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit, where the first spectral feature data and the second spectral feature data are spectral feature data of the same sample product and the first spectral feature data is the set of spectral feature data immediately following the second spectral feature data; determine the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data; determine the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate; and output the memory information of the last set of spectral feature data of each sample product as the forward external state.
In an alternative approach, the training module 450 is further configured to:
according to the first spectral feature data and the memory information of the second spectral feature data recorded by the GRU neuron, the gating signals of the reset gate and the update gate are calculated according to the following formulas:
r_t = σ(W_r x_t + U_r h_{t-1})
z_t = σ(W_z x_t + U_z h_{t-1})
where r_t and z_t denote the gating signal of the reset gate and the gating signal of the update gate respectively, σ denotes the sigmoid function, W_r, U_r and W_z, U_z denote the weight matrices of the reset gate and of the update gate respectively, h_{t-1} denotes the memory information of the second spectral feature data, and x_t denotes the first spectral feature data.
In an alternative approach, the training module 450 is further configured to:
determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data and the following formula:
h_t' = tanh(W x_t + U (r_t ⊙ h_{t-1}))
where h_t' denotes the retention amount of the memory information of the second spectral feature data, h_{t-1} denotes the memory information of the second spectral feature data, ⊙ denotes element-wise multiplication, and W and U denote the weight matrices of the forward GRU units.
The training module 450 is further configured to:
determining the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate and the following formula:
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h_t'
where h_t denotes the memory information of the first spectral feature data, z_t denotes the gating signal of the update gate, h_{t-1} denotes the memory information of the second spectral feature data, and h_t' denotes the retention amount of the memory information of the second spectral feature data.
According to the embodiment of the invention, the segmentation module 420 divides the acquired spectral data of the product to be classified to obtain multiple sets of spectral feature data, and the input module 430 inputs these sets of spectral feature data into the classification model to obtain the type of the product to be classified. Because the classification model is obtained by training a time-cycle neural network model with multiple sets of training data, it contains the dependency relationships among the sets of spectral feature data and the product classification results corresponding to sets of spectral feature data with different dependency relationships, so the product to be classified can be classified accurately.
An embodiment of the present invention provides a non-volatile computer storage medium, where at least one executable instruction is stored in the computer storage medium, and the computer executable instruction may execute an operation corresponding to a product classification method in any method embodiment described above.
Fig. 5 is a schematic structural diagram of a computing device according to a fourth embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the computing device.
As shown in fig. 5, the computing device may include: a processor (processor)502, a Communications Interface 504, a memory 506, and a communication bus 508.
Wherein: the processor 502, communication interface 504, and memory 506 communicate with one another via a communication bus 508. A communication interface 504 for communicating with network elements of other devices, such as clients or other servers. The processor 502, configured to execute the program 510, may specifically perform the relevant steps in the embodiments of the product classification method described above.
In particular, program 510 may include program code that includes computer operating instructions.
The processor 502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the present invention. The computing device includes one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
And a memory 506 for storing a program 510. The memory 506 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 510 may be specifically configured to enable the processor 502 to execute steps 110 to 130 in fig. 1, steps 210 to 250 in fig. 2, and implement the functions of the modules 410 to 450 in fig. 4.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third, etc., does not indicate any ordering; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specified otherwise.

Claims (13)

1. A method of product classification, the method comprising:
acquiring spectral data of a product to be classified;
evenly dividing the spectral data of the product to be classified into a plurality of groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points and each group of spectral feature data corresponds to one time step;
and inputting the plurality of groups of spectral feature data into a classification model to determine the dependency relationships among the plurality of groups of spectral feature data based on the time steps, and obtaining the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with a plurality of groups of training data.
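The splitting step of claim 1 can be sketched as follows. This is not part of the patent text; it is a minimal numpy illustration, and the concrete sizes (a 700-point spectrum split into 10 groups of 70) are hypothetical:

```python
import numpy as np

def split_into_time_steps(spectrum, n_steps):
    """Evenly divide a 1-D spectrum into n_steps equal-length groups,
    one group of spectral feature data per time step (remainder points
    beyond an even split are dropped)."""
    group_len = len(spectrum) // n_steps
    trimmed = spectrum[:group_len * n_steps]
    # Row t is the spectral feature data fed at time step t.
    return trimmed.reshape(n_steps, group_len)

spectrum = np.random.rand(700)              # hypothetical 700-point spectrum
steps = split_into_time_steps(spectrum, 10)
print(steps.shape)                          # (10, 70)
```

Each row of the result then plays the role of one "group of spectral feature data" in the claims, so the whole spectrum becomes a length-10 sequence that a recurrent model can consume.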
2. The method of claim 1, wherein prior to acquiring spectral data of a product to be classified, the method further comprises:
obtaining a plurality of groups of training data, each group of training data comprising: a plurality of groups of spectral feature data of a sample product and identification information identifying the type of the sample product;
and training a bidirectional gated recurrent unit (Bi-GRU) model according to the plurality of groups of training data to obtain the classification model.
3. The method of claim 2, wherein obtaining a plurality of sets of training data comprises:
acquiring spectral data of all sample products;
evenly dividing the spectral data of each sample product into a plurality of groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points, so as to obtain the plurality of groups of spectral feature data of each sample product;
identifying each sample product to obtain identification information of each sample product, wherein the identification information of the sample products of the same kind is the same, and the identification information of the sample products of different kinds is different;
and taking the multiple groups of spectral feature data and the identification information of a sample product as a group of training data to obtain the multiple groups of training data.
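The assembly of training data in claim 3 can be sketched as below. This is an illustrative numpy snippet, not the patent's implementation; the function name, the sample count, and the split sizes are all hypothetical:

```python
import numpy as np

def build_training_set(spectra, labels, n_steps):
    """Pair each sample product's split spectrum with its identification
    information (an integer label; same kind -> same label)."""
    X, y = [], []
    for spectrum, label in zip(spectra, labels):
        group_len = len(spectrum) // n_steps
        groups = spectrum[:group_len * n_steps].reshape(n_steps, group_len)
        X.append(groups)   # groups of spectral feature data of this sample
        y.append(label)    # identification information of its kind
    return np.stack(X), np.array(y)

spectra = [np.random.rand(700) for _ in range(6)]  # hypothetical 6 samples
labels = [0, 0, 1, 1, 2, 2]                        # 3 hypothetical kinds
X, y = build_training_set(spectra, labels, n_steps=10)
print(X.shape, y.shape)                            # (6, 10, 70) (6,)
```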
4. The method of claim 2, wherein the Bi-GRU model comprises two Bi-GRU layers for learning the dependency relationships among the groups of spectral feature data of each sample product, and one fully-connected layer for outputting classification results.
5. The method of claim 2, wherein training the Bi-GRU model according to the plurality of sets of training data to obtain a classification model comprises:
inputting a plurality of sets of spectral feature data of the sample product into a first Bi-GRU layer;
learning the dependency relationship among the multiple groups of spectral feature data of each sample product through the first Bi-GRU layer to output a first external state of each sample product;
further learning, through a second Bi-GRU layer, the dependency relationships among the plurality of groups of spectral feature data of each sample product according to the first external state, so as to output a second external state of each sample product;
weighting the second external state through a fully-connected layer to obtain a weighting result for each sample product;
performing classification on the weighting result through a normalized exponential function (softmax) classifier to obtain an output result corresponding to each sample product;
calculating a loss function value according to the output result and the identification information;
updating the weights of the Bi-GRU model according to the loss function value until the loss function value is minimized;
and taking the Bi-GRU model with the minimum loss function value as the classification model.
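The last three steps of claim 5 (fully-connected weighting, softmax classification, and loss computation) can be sketched in numpy as follows. All weights and sizes here are randomly initialized placeholders, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, state_dim, n_classes = 4, 8, 3        # hypothetical sizes
h2 = rng.standard_normal((n_samples, state_dim)) # second external states

# Fully-connected layer: weight the second external state.
W_fc = rng.standard_normal((state_dim, n_classes))
b_fc = np.zeros(n_classes)
logits = h2 @ W_fc + b_fc                        # weighting result per sample

# softmax: normalized exponential over the classes (stabilized by the max).
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)
pred = probs.argmax(axis=1)                      # most probable class

# Cross-entropy loss against the identification information (labels).
labels = np.array([0, 1, 2, 0])
loss = -np.log(probs[np.arange(n_samples), labels]).mean()
print(pred.shape, loss > 0)
```

In training, the gradient of this loss would be back-propagated to update the Bi-GRU and fully-connected weights, which is the "updating the weights ... according to the loss function value" step of the claim.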
6. The method of claim 5, wherein the first Bi-GRU layer comprises forward GRU units and reverse GRU units, the forward GRU units and the reverse GRU units being independent of each other;
learning, by the first Bi-GRU layer, a dependency relationship between the plurality of sets of spectral feature data of each sample product to output a first external state of each sample product, including:
learning, by the forward GRU unit, a forward dependency relationship between the sets of spectral feature data of each sample product to output a forward external state of each sample product;
learning, by the reverse GRU unit, an inverse dependency relationship between the sets of spectral feature data for each sample product to output a reverse external state for each sample product;
and combining the forward external state and the reverse external state to obtain a first external state of each sample product.
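The bidirectional combination of claim 6 can be sketched as below: one GRU pass over the sequence in order, an independent pass over the reversed sequence, and concatenation of the two final states. This is a minimal numpy sketch with hypothetical sizes and randomly initialized, independent weight sets for the two directions:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_pass(xs, Wr, Ur, Wz, Uz, W, U):
    """Run a single-direction GRU over sequence xs of shape (steps, features)
    and return its final external state."""
    h = np.zeros(U.shape[0])
    for x in xs:
        r = sigmoid(Wr @ x + Ur @ h)              # reset-gate signal
        z = sigmoid(Wz @ x + Uz @ h)              # update-gate signal
        h_cand = np.tanh(W @ x + U @ (r * h))     # candidate (retained) state
        h = (1 - z) * h + z * h_cand              # memory update
    return h

rng = np.random.default_rng(1)
steps, feat, hidden = 10, 70, 16                  # hypothetical sizes
xs = rng.standard_normal((steps, feat))
# Independent weight sets for the forward and reverse GRU units:
fw = [rng.standard_normal((hidden, feat)) if i % 2 == 0
      else rng.standard_normal((hidden, hidden)) for i in range(6)]
bw = [rng.standard_normal((hidden, feat)) if i % 2 == 0
      else rng.standard_normal((hidden, hidden)) for i in range(6)]

h_forward = gru_pass(xs, *fw)         # forward dependency relationship
h_reverse = gru_pass(xs[::-1], *bw)   # reverse dependency relationship
first_state = np.concatenate([h_forward, h_reverse])  # combined first external state
print(first_state.shape)              # (32,)
```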
7. The method of claim 6, wherein the forward GRU unit includes a reset gate and an update gate;
the learning, by the forward GRU unit, a forward dependency relationship between the plurality of sets of spectral feature data of each sample product, and outputting a forward external state of each sample product, includes:
calculating gating signals of the reset gate and the update gate according to first spectral feature data and memory information of second spectral feature data recorded by the forward GRU unit, wherein the first spectral feature data and the second spectral feature data are spectral feature data of the same sample product, and the first spectral feature data is the group of spectral feature data next after the second spectral feature data;
determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data;
determining memory information of the first spectral feature data according to the retention amount and a gating signal of the updating gate;
and outputting the memory information of the last group of spectral characteristic data of each sample product as the forward external state.
8. The method of claim 7, wherein calculating the gating signals of the reset gate and the update gate according to the first spectral feature data and the memory information of the second spectral feature data recorded by the forward GRU unit comprises:
calculating the gating signals of the reset gate and the update gate according to the following formulas:
r_t = σ(W_r·x_t + U_r·h_(t-1))
z_t = σ(W_z·x_t + U_z·h_(t-1))
wherein r_t and z_t respectively represent the gating signal of the reset gate and the gating signal of the update gate, σ represents the sigmoid function, W_r, U_r and W_z, U_z respectively represent the weight matrices of the reset gate and of the update gate, h_(t-1) represents the memory information of the second spectral feature data, and x_t represents the first spectral feature data.
9. The method of claim 8, wherein determining the retention amount of the memory information of the second spectral feature data according to the gating signal of the reset gate and the first spectral feature data comprises:
determining the retention amount of the memory information of the second spectral feature data according to the following formula:
h'_t = tanh(W·x_t + U·(r_t ⊙ h_(t-1)))
wherein h'_t represents the retention amount of the memory information of the second spectral feature data, r_t represents the gating signal of the reset gate, ⊙ represents element-wise multiplication, h_(t-1) represents the memory information of the second spectral feature data, and W and U represent the weight matrices of the forward GRU unit.
10. The method of claim 9, wherein determining the memory information of the first spectral feature data according to the retention amount and the gating signal of the update gate comprises:
determining the memory information of the first spectral feature data according to the following formula:
h_t = (1 − z_t) ⊙ h_(t-1) + z_t ⊙ h'_t
wherein h_t represents the memory information of the first spectral feature data, z_t represents the gating signal of the update gate, h_(t-1) represents the memory information of the second spectral feature data, and h'_t represents the retention amount of the memory information of the second spectral feature data.
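The formulas of claims 8-10 can be combined into a single GRU update step. The numpy sketch below is illustrative only (the function name and dimensions are hypothetical), following the standard GRU form the claims describe:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, Wr, Ur, Wz, Uz, W, U):
    """One forward-GRU update: x_t is the first spectral feature data,
    h_prev the memory information of the second spectral feature data."""
    r_t = sigmoid(Wr @ x_t + Ur @ h_prev)           # reset-gate signal  (claim 8)
    z_t = sigmoid(Wz @ x_t + Uz @ h_prev)           # update-gate signal (claim 8)
    h_ret = np.tanh(W @ x_t + U @ (r_t * h_prev))   # retention amount   (claim 9)
    h_t = (1 - z_t) * h_prev + z_t * h_ret          # new memory info    (claim 10)
    return h_t

rng = np.random.default_rng(2)
feat, hidden = 70, 16                               # hypothetical sizes
x_t = rng.standard_normal(feat)
h_prev = np.zeros(hidden)                           # initial memory information
mats = [rng.standard_normal((hidden, feat)) if i % 2 == 0
        else rng.standard_normal((hidden, hidden)) for i in range(6)]
h_t = gru_step(x_t, h_prev, *mats)
print(h_t.shape)                                    # (16,)
```

Iterating this step over every group of spectral feature data of a sample, and taking the memory information after the last group, yields the forward external state of claim 7.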
11. A product sorting apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring the spectral data of the products to be classified;
the segmentation module is used for evenly dividing the spectral data of the product to be classified into a plurality of groups of spectral feature data, wherein each group of spectral feature data contains the same number of spectral data points and each group of spectral feature data corresponds to one time step;
and the input module is used for inputting the plurality of groups of spectral feature data into a classification model so as to determine the dependency relationships among the plurality of groups of spectral feature data based on the time steps and obtain the type of the product to be classified based on the dependency relationships, wherein the classification model is obtained by training a recurrent neural network model with a plurality of groups of training data.
12. A computing device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction which causes the processor to perform operations corresponding to the product classification method according to any one of claims 1-10.
13. A computer storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to a method of product classification according to any one of claims 1-10.
CN201910804795.4A 2019-08-28 2019-08-28 Product classification method, device, computing equipment and computer storage medium Active CN110646350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910804795.4A CN110646350B (en) 2019-08-28 2019-08-28 Product classification method, device, computing equipment and computer storage medium


Publications (2)

Publication Number Publication Date
CN110646350A true CN110646350A (en) 2020-01-03
CN110646350B CN110646350B (en) 2023-06-02

Family

ID=68991029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910804795.4A Active CN110646350B (en) 2019-08-28 2019-08-28 Product classification method, device, computing equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN110646350B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111579724A (en) * 2020-06-01 2020-08-25 中国标准化研究院 Rapid classification method and device for sensory sensitivity of tingling and peppery suprathreshold and application
US20230032011A1 (en) * 2021-07-29 2023-02-02 Panasonic Intellectual Property Management Co., Ltd. Forecast generating system and method thereof

Citations (8)

Publication number Priority date Publication date Assignee Title
CN201156043Y (en) * 2008-02-26 2008-11-26 浙江大学 Non-destruction detector for synthetic quality of food
CN107169535A (en) * 2017-07-06 2017-09-15 谈宜勇 The deep learning sorting technique and device of biological multispectral image
CN107646089A (en) * 2015-03-06 2018-01-30 英国质谱公司 Spectrum analysis
CN108197751A (en) * 2018-01-23 2018-06-22 国网山东省电力公司电力科学研究院 Seq2seq network Short-Term Load Forecasting Methods based on multilayer Bi-GRU
CN108921285A (en) * 2018-06-22 2018-11-30 西安理工大学 Single-element classification method in sequence based on bidirectional valve controlled Recognition with Recurrent Neural Network
CN109002771A (en) * 2018-06-26 2018-12-14 中国科学院遥感与数字地球研究所 A kind of Classifying Method in Remote Sensing Image based on recurrent neural network
CN109063773A (en) * 2018-08-03 2018-12-21 华中科技大学 A method of laser microprobe nicety of grading is improved using characteristics of image
CN109632693A (en) * 2018-12-10 2019-04-16 昆明理工大学 A kind of tera-hertz spectra recognition methods based on BLSTM-RNN


Non-Patent Citations (2)

Title
李晟 et al.: "A rapid classification method for petroleum products based on Raman spectroscopy", Spectroscopy and Spectral Analysis *
王红 et al.: "Research on ontology relation extraction methods for civil aviation emergencies", Journal of Frontiers of Computer Science and Technology *



Similar Documents

Publication Publication Date Title
Khaing et al. Development of control system for fruit classification based on convolutional neural network
CN106897738B (en) A kind of pedestrian detection method based on semi-supervised learning
CN105224951B (en) A kind of vehicle type classification method and sorter
CN111832650B (en) Image classification method based on generation of antagonism network local aggregation coding semi-supervision
Marchant et al. Automated analysis of foraminifera fossil records by image classification using a convolutional neural network
Zhao et al. SEV‐Net: Residual network embedded with attention mechanism for plant disease severity detection
CN111881671B (en) Attribute word extraction method
CN110197205A (en) A kind of image-recognizing method of multiple features source residual error network
CN105631474B (en) Based on Jeffries-Matusita distance and class to the more classification methods of the high-spectral data of decision tree
Vaviya et al. Identification of artificially ripened fruits using machine learning
Qiao et al. Detection and classification of early decay on blueberry based on improved deep residual 3D convolutional neural network in hyperspectral images
CN110646350B (en) Product classification method, device, computing equipment and computer storage medium
CN110427912A (en) A kind of method for detecting human face and its relevant apparatus based on deep learning
CN116612386A (en) Pepper disease and pest identification method and system based on hierarchical detection double-task model
CN110738246B (en) Product classification method, device, computing equipment and computer storage medium
CN110163206B (en) License plate recognition method, system, storage medium and device
CN111709442A (en) Multilayer dictionary learning method for image classification task
CN113592008B (en) System, method, device and storage medium for classifying small sample images
CN116563850A (en) Multi-class target detection method and model training method and device thereof
Fawwaz et al. The Optimization of CNN Algorithm Using Transfer Learning for Marine Fauna Classification
CN116091763A (en) Apple leaf disease image semantic segmentation system, segmentation method, device and medium
Hou et al. FMRSS Net: Fast Matrix Representation‐Based Spectral‐Spatial Feature Learning Convolutional Neural Network for Hyperspectral Image Classification
Fitrianah et al. Fine-tuned mobilenetv2 and vgg16 algorithm for fish image classification
Karthik et al. GrapeLeafNet: A Dual-Track Feature Fusion Network with Inception-ResNet and Shuffle-Transformer for Accurate Grape Leaf Disease Identification
Sucharitha et al. A Study on the Performance of Deep Learning Models for Leaf Disease Detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200407

Address after: 1706, Fangda building, No. 011, Keji South 12th Road, high tech Zone, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen shuliantianxia Intelligent Technology Co.,Ltd.

Address before: 518000, building 10, building ten, building D, Shenzhen Institute of Aerospace Science and technology, 6 hi tech Southern District, Nanshan District, Shenzhen, Guangdong 1003, China

Applicant before: SHENZHEN H & T HOME ONLINE NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant