US11568230B2 - Method and device for food risk traceability information classification, and computer readable storage medium - Google Patents
- Publication number
- US11568230B2 (application US17/009,800)
- Authority
- US
- United States
- Prior art keywords
- traceability information
- current
- food risk
- vectors
- obtaining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G06Q10/0635—Risk analysis of enterprise or organisation activities
- G06N3/065—Analogue means for the physical realisation of neural networks
- G06N3/0635—
- G06F16/906—Clustering; Classification
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2148—Generating training patterns characterised by the process organisation or structure, e.g. boosting cascade
- G06F18/24133—Classification techniques based on distances to prototypes
- G06K9/6257—
- G06N3/08—Learning methods
- G06N3/082—Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
- G06N3/084—Backpropagation, e.g. using gradient descent
- G06Q30/0185—Product, service or business identity fraud
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/045—Combinations of networks
- G06N3/047—Probabilistic or stochastic networks
Definitions
- the present disclosure generally relates to food traceability technology, and particularly to a computer-implemented method for food risk traceability information classification, a device for food risk traceability information classification, and a non-transitory computer readable storage medium.
- Food safety concerns not only the health and life safety of consumers, but also the healthy development of the economy and the harmony and stability of society, so it draws wide attention both internationally and domestically.
- consumers now have more choices of food types, more knowledge about food safety, and greater concern for their own health.
- at the same time, human health suffers increasing harm from food: accidents such as food poisoning, foodborne diseases, and food contamination occur frequently, and more and more attention has been paid to food safety.
- Food safety tracing means that food producers, processors, and distributors record, save, and disclose to consumers the information that may affect food quality and safety during food production and sales, so that the whole process of food supply can be reproduced after the food has been produced or circulated. The goal is that "the source can be traced, the flow direction can be traced, the process can be monitored, and the product can be recalled", thereby ensuring food quality and safety.
- the traceability technology itself cannot solve food safety issues.
- Identifying food risk information, and monitoring and providing early warning of possible hazards across the whole food supply chain, are a goal and direction of common concern to food industry enterprises and the governments of all countries.
- conventionally, food tracing is achieved with label carriers, that is, various types of information carriers such as paper labels, plastic labels, and electronic labels are used to collect the individual-unit or batch information that needs to be traced.
- the GS1 Global Traceability Standard, set by GS1 (Global Standards 1), describes a process of using label-carrier methods to achieve tracing, and GS1 has set technical standards for the identification, information collection, and exchange of traceability units.
- FIG. 1 is a flow chart of a computer-implemented method for food risk traceability information classification according to an embodiment of the present disclosure;
- FIG. 2 is a block diagram of a device for food risk traceability information classification according to an embodiment of the present disclosure;
- FIG. 3 is a flow chart of the computer-implemented method for food risk traceability information classification according to another embodiment of the present disclosure;
- FIG. 4 is a flow chart of the computer-implemented method for food risk traceability information classification according to another embodiment of the present disclosure; and
- FIG. 5 is a block diagram of a computer equipment for the computer-implemented method for food risk traceability information classification according to an embodiment of the present disclosure.
- the method includes: building a deep learning neural networks model used for the food risk traceability information classification based on a self-learning ability of an artificial intelligence model, initializing weights and a bias of the deep learning neural networks model, and obtaining an original deep learning neural networks model; obtaining samples of food risk traceability information, dividing the samples of the food risk traceability information according to a format of at least one preset basic traceability information factor, and obtaining factors of the food risk traceability information; converting the factors of the food risk traceability information into vectors of the food risk traceability information, according to a preset vectorization method; inputting the vectors of the food risk traceability information into the original deep learning neural networks model, and obtaining original classification vectors of current food risk traceability information; and inputting the original classification vectors into a loss function, obtaining a loss rate of the original classification vectors, and determining the original classification vectors as a target classification result in response to the loss rate being within a preset range.
- FIG. 1 is a flow chart of a computer-implemented method for food risk traceability information classification according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes the following steps:
- S 200 obtaining samples of food risk traceability information, dividing the samples of the food risk traceability information according to a format of at least one preset basic traceability information factor, and obtaining factors of the food risk traceability information;
- the deep learning neural networks model used for the food risk traceability information classification is built based on the self-learning ability of the artificial intelligence model, the weights and the bias of the deep learning neural networks model are initialized, and the original deep learning neural networks model is obtained.
- a weight W in the deep learning neural networks model is set to a random value within a preset range, and the factors of the food risk traceability information in the training set, together with the previously built hidden matrix, are inputted into the built deep learning neural networks model used for food risk traceability information classification.
- the samples of the food risk traceability information are obtained and divided according to the format of at least one preset basic traceability information factor, and the factors of the food risk traceability information are obtained.
- the information read from the web page is divided according to the format of six preset basic traceability information factors: "person, event, time, place, object, and belonging".
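As an illustrative sketch only (not taken from the patent), the division step above might look like the following, assuming a hypothetical delimiter-separated record format; the delimiter, field order, and sample record are assumptions for illustration:

```python
# Minimal sketch: split a scraped traceability record into the six preset
# basic factors "person, event, time, place, object, and belonging".
FACTORS = ["person", "event", "time", "place", "object", "belonging"]

def divide_record(record: str, sep: str = "|") -> dict:
    """Map each delimited field of a record onto one factor name."""
    fields = [f.strip() for f in record.split(sep)]
    # Pad with empty strings if a record has fewer than six fields.
    fields += [""] * (len(FACTORS) - len(fields))
    return dict(zip(FACTORS, fields))

# Hypothetical record, for illustration only.
record = "Inspector Li|sampling test|2020-05-01|Shenzhen market|pork batch 17|Farm A"
factors = divide_record(record)
print(factors["object"])  # the "object" factor of this record
```

In a real pipeline the splitting logic would depend on how the web page is structured; the point is only that each record ends up keyed by the six factor names.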
- the factors of the food risk traceability information are converted into the vectors of the food risk traceability information, according to the preset vectorization method.
- the information factors divided after reading are received and converted into the vectors by using a specific vectorization method (e.g. word2vec).
- Word2vec is a family of models used to generate word vectors. These models are shallow, two-layer neural networks trained to reconstruct the linguistic context of words: given a word, the network guesses the words in adjacent positions; under the bag-of-words assumption used in word2vec, the order of words is not important.
- the word2vec model can be used to map each word to a vector that represents the relationships between words; these vectors are taken from the hidden layer of the neural networks.
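To make the word-to-vector mapping concrete, here is a toy sketch (an assumption for illustration, not the patent's implementation): every vocabulary word is assigned a fixed dense vector, so factors can be compared numerically. A real system would train these vectors with word2vec (e.g. via a library such as gensim) rather than drawing them at random:

```python
# Toy illustration of the word -> vector mapping that word2vec produces.
# Real word2vec learns the vectors; here they are seeded random values.
import random

def build_embeddings(corpus, dim=8, seed=42):
    """Assign every distinct word a fixed random vector of length `dim`."""
    rng = random.Random(seed)
    vocab = sorted({w for sentence in corpus for w in sentence})
    return {w: [rng.uniform(-1, 1) for _ in range(dim)] for w in vocab}

corpus = [["pork", "batch", "sampling"], ["poultry", "batch", "inspection"]]
emb = build_embeddings(corpus)
vec = emb["batch"]  # the 8-dimensional vector for "batch"
print(len(vec))     # 8
```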
- in step S 400, the vectors of the food risk traceability information are inputted into the original deep learning neural networks model, and the original classification vectors of the current food risk traceability information are obtained.
- the original classification vectors are inputted into the loss function, and the loss rate of the original classification vectors is obtained.
- in response to the loss rate being within a preset range, the original classification vectors are determined as the target classification result.
- the parameters (weights) are modified according to the stochastic gradient descent method and back propagation, this operation is repeated until the result converges in a reasonable interval, so as to realize the backward feedback control mechanism of the system.
- the method further includes:
- the hidden matrix left by the previous piece of food risk traceability information is obtained and used as the input of the current node, so that it affects the system output; this realizes the forward feedback control mechanism of the system.
- the parameters (weights) are modified according to the stochastic gradient descent method and back propagation, this operation is repeated until the result converges in a reasonable interval, so as to realize the backward feedback control mechanism of the system.
- the step of inputting the original classification vectors into the original deep learning neural networks model for training, adjusting the weighted values and the bias of the original deep learning neural networks model, and obtaining the target deep learning neural networks model includes:
- the information read from the web page is divided according to the format of six preset basic traceability information factors: "person, event, time, place, object, and belonging".
- a hidden matrix is initialized and built as the input of the deep learning neural networks model, and the deep learning neural networks model outputs a corresponding hidden matrix as the input of the next node, so as to realize a forward feedback process (K process) of the deep learning neural networks model.
- the parameters Wr, Wz, and W in the deep learning neural networks model need to be initialized and trained.
- the weight W is set to a random value within a preset range, and the factors of the food risk traceability information in the training set, together with the previously built hidden matrix, are inputted into the built deep learning neural networks model.
- the loss function is calculated, and the parameters Wr, Wz, and W are adjusted gradually by the stochastic gradient descent method, so that the output y approaches the standard output y r. This operation is repeated until the parameters converge and the output y is within the specified range.
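The gradient-descent adjustment described above can be sketched on a deliberately simplified one-parameter model y = w * x (an assumption for illustration; the patent adjusts the full parameter set Wr, Wz, W over the network):

```python
# Minimal sketch of the repeat-until-convergence gradient-descent loop.
def train_sgd(x, y_r, w=0.0, lr=0.1, tol=1e-6, max_steps=1000):
    """Repeat the update until the squared loss falls within `tol`."""
    for _ in range(max_steps):
        y = w * x                  # forward pass
        loss = (y - y_r) ** 2      # squared-error loss against standard output
        if loss < tol:             # output y is within the specified range
            break
        grad = 2 * (y - y_r) * x   # d(loss)/dw via back propagation
        w -= lr * grad             # gradient-descent step
    return w, loss

w, loss = train_sgd(x=2.0, y_r=6.0)
print(round(w, 3))  # ≈ 3.0, since 3.0 * 2.0 matches the standard output 6.0
```

The same loop shape applies per parameter matrix in the real model, with the gradients supplied by back propagation through the network.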
- the factors of the food risk traceability information divided after reading are received and converted into the vectors by the specific vectorization method (e.g. word2vec).
- the deep learning neural networks model includes an input layer, a hidden layer, and an output layer.
- the hidden layer includes a bidirectional gated recurrent neural network layer and a fully connected layer.
- the bidirectional gated recurrent neural network layer includes a hidden matrix, a reset gate, and an update gate.
- the step of inputting the original classification vectors into the original deep learning neural networks model for training, adjusting the weighted values and the bias of the original deep learning neural networks model, and obtaining the target deep learning neural networks model further includes:
- a hidden state h t-1 is given as the feedforward state passed down from the previous node.
- Two gated states, a reset gate r and an update gate z, are obtained from the hidden state h t-1 passed down from the previous node and the input X t of the current node, according to the following formulas: r = σ(W r ·[h t-1 , X t ]) and z = σ(W z ·[h t-1 , X t ]).
- W r is the weight of the reset gate
- W z is the weight of the update gate
- W r and W z are initialized to random values within the range.
- σ is a sigmoid activation function
- the sigmoid activation function formula is as follows: σ(x) = 1/(1 + e^(−x))
- Softmax is a normalization function that scales the output vector into a probability distribution: softmax(y i ) = e^(y i ) / Σ j e^(y j )
- W is a weight matrix, and is initialized to a random value within the range.
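The gate computations above can be sketched as follows. For brevity this uses a scalar hidden state and scalar input with pairs of scalar weights (an assumption for illustration; the patent uses weight matrices Wr, Wz, W over vector states), and the candidate-state/blend step follows the standard gated-recurrent-unit form:

```python
# Sketch of one forward step of a gated recurrent unit: reset gate r,
# update gate z, sigmoid activation, and a softmax-normalized output.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(ys):
    exps = [math.exp(y) for y in ys]
    total = sum(exps)
    return [e / total for e in exps]

def gru_step(h_prev, x_t, w_r, w_z, w):
    """One step: gates from [h_{t-1}, x_t], then the new hidden state."""
    r = sigmoid(w_r[0] * h_prev + w_r[1] * x_t)           # reset gate
    z = sigmoid(w_z[0] * h_prev + w_z[1] * x_t)           # update gate
    h_cand = math.tanh(w[0] * (r * h_prev) + w[1] * x_t)  # candidate state
    h_t = (1.0 - z) * h_prev + z * h_cand                 # blend old and new
    return h_t

# Hypothetical weights, for illustration only.
h = gru_step(h_prev=0.5, x_t=1.0, w_r=(0.1, 0.2), w_z=(0.3, 0.4), w=(0.5, 0.6))
probs = softmax([h, 1.0 - h])          # toy two-class classification output
print(abs(sum(probs) - 1.0) < 1e-9)    # True: probabilities sum to 1
```

The returned hidden state plays the role of the hidden matrix passed to the next node, which is exactly the forward (K) feedback described below.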
- the parameters (weights) are modified by using the stochastic gradient descent method and the back propagation, the operation is repeated until the result converges in a reasonable interval, so as to realize the backward feedback control mechanism of the system.
- the hidden state of the neural networks itself is used as the input of the next node, realizing the K control mechanism of the bidirectional feedback mechanism, so that the whole traceability system is adjusted by previous information, that is, forward feedback.
- the parameters of the traceability system are adjusted by comparing the vectors of the predicted result with the vectors of the correct result, realizing the R feedback mechanism.
- the K and R control mechanisms are used to adjust the parameters of the traceability system according to the F feedback mechanism, so as to achieve the classification purpose of the traceability system, realizing the bidirectional feedback mechanism of KFR.
- the at least one preset basic traceability information factor includes one or more of a person factor, an event factor, a time factor, a place factor, an object factor, and a belonging factor.
- the preset vectorization method is specifically to use Word2vec.
- FIG. 2 is a block diagram of a computer device for food risk traceability information classification according to an embodiment of the present disclosure. As shown in FIG. 2, the device includes:
- an initialization module 100 used for building a deep learning neural networks model used for the food risk traceability information classification based on a self-learning ability of an artificial intelligence model, initializing weights and a bias of the deep learning neural networks model, and obtaining an original deep learning neural networks model;
- a format division module 200 used for obtaining samples of food risk traceability information, dividing the samples of the food risk traceability information according to a format of at least one preset basic traceability information factor, and obtaining factors of the food risk traceability information;
- a text vector module 300 used for converting the factors of the food risk traceability information into vectors of the food risk traceability information, according to a preset vectorization method
- a classification module 400 used for inputting the vectors of the food risk traceability information into the original deep learning neural networks model, and obtaining original classification vectors of current food risk traceability information;
- a determining module 500 used for inputting the original classification vectors into a loss function, obtaining a loss rate of the original classification vectors, determining the original classification vectors as a target classification result in response to the loss rate being within a preset range, and outputting and storing the target classification result in a non-transitory storage.
- the determining module 500 includes:
- a training sub module used for inputting the original classification vectors into the original deep learning neural networks model for training in response to the loss rate not being within the preset range, adjusting weights and a bias of the original deep learning neural networks model, and obtaining the target deep learning neural networks model;
- a determining sub module used for inputting the vectors of the food risk traceability information into the target deep learning neural networks model to perform a normalization processing, and obtaining the target classification result.
- the training sub module includes:
- an error calculating unit used for calculating an error between the original classification vectors and a preset standard vector
- a step size calculating unit used for calculating a step size corresponding to the error by a gradient descent method
- a model updating unit used for updating weights and a bias of a current node according to the step size, and obtaining the target deep learning neural networks model.
- the deep learning neural networks model includes an input layer, a hidden layer, and an output layer.
- the hidden layer includes a bidirectional gated recurrent neural network layer and a fully connected layer.
- the bidirectional gated recurrent neural network layer includes a hidden matrix, a reset gate, and an update gate.
- the training sub module further includes:
- an input unit used for inputting vectors of the food risk traceability information inputted by the current node into the bidirectional gated recurrent neural network layer, and obtaining a weight of a current reset gate and a weight of a current update gate;
- a gate updating unit used for inputting the weight of the current reset gate, the weight of the current update gate, the vectors of the food risk traceability information inputted by the current node, and a hidden matrix of a previous node into an activation function, and obtaining parameters of the current reset gate and parameters of the current update gate;
- a reset unit used for calculating and obtaining a hidden matrix of the current node, by using the parameters of the current reset gate, the parameters of the current update gate, and the hidden matrix of the previous node;
- a model rebuilding unit used for building a target deep learning neural networks model of the current node, by using the hidden matrix of the current node and a weight of the current node.
- the at least one preset basic traceability information factor includes one or more of a person factor, an event factor, a time factor, a place factor, an object factor, and a belonging factor.
- the preset vectorization method is to use Word2vec.
- since the device embodiments are basically similar to the method embodiments, the description is relatively simple; for the related parts, please refer to the description of the method embodiments.
- FIG. 5 is a block diagram of the computer equipment for the computer-implemented method for food risk traceability information classification according to an embodiment of the present disclosure.
- the above computer device 12 is embodied in a form of a general purpose computing device.
- Components of the computer device 12 may include, but are not limited to: one or more processors or processing units 16 , a system memory 28 , and a bus 18 connecting different system components (including the system memory 28 and the processing unit 16 ).
- the bus 18 represents one or more of several types of bus structures, including a memory bus or a memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
- these architectures include, but are not limited to, an industry standard architecture (ISA) bus, a micro channel architecture (MCA) bus, an enhanced ISA bus, a video electronics standards association (VESA) local bus, and a peripheral component interconnect (PCI) bus.
- the computer device 12 typically includes a plurality of computer system readable media. These media may be any available medium accessible by the computer device 12 , including volatile and non-volatile media, removable and non-removable media.
- the system memory 28 may include a computer system readable medium in a volatile memory form, for example, a random access memory (RAM) 30 and/or a high-speed cache memory 32 .
- the computer device 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
- the storage system 34 may be used to read and write non-removable, non-volatile magnetic media (commonly referred to as a “hard disk drive”).
- a magnetic disk drive for reading and writing a removable non-volatile magnetic disk (for example, a “floppy disk”)
- an optical disc drive for reading and writing a removable non-volatile optical disc (for example, a compact disc read only memory (CD-ROM), a digital video disc read only memory (DVD-ROM), or other optical media)
- each drive may be connected to the bus 18 through one or more data media interfaces.
- the memory may include at least one program product, and the program product has a group of (for example, at least one) program modules 42 . These program modules 42 are configured to execute functions of the embodiments of the present disclosure.
- a program/utility tool 40 having a group of (at least one) program module 42 may be stored, for example, in the memory.
- Such program modules 42 include, but are not limited to: an operating system, one or more application programs, other program modules 42 , and program data. Each of these examples, or a certain combination thereof, may include implementation of a network environment.
- the program module 42 generally performs functions and/or methods in the embodiments described in the present disclosure.
- the computer device 12 may also communicate with one or more external devices 14 (for example, a keyboard, a pointing device, a display device 24 , and the like), and may also communicate with one or more devices that enable a user to interact with the computer device 12 , and/or communicate with any device (for example, a network adapter, a modem, and the like) that enables the computer device 12 to communicate with one or more other computing devices. This communication may proceed through an input/output (I/O) interface 22 .
- the computer device 12 may also communicate with one or more networks (for example, a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 20 .
- the network adapter 20 communicates with other modules of the computer device 12 through the bus 18 .
- other hardware and/or software modules may be used in conjunction with the computer device 12 , including but not limited to: a microcode, a device driver, a redundancy processing unit, an external magnetic disk driving array, a RAID system, a magnetic tape drive, and a data backup storage system 34 , and the like.
- the processing unit 16 executes various function applications and data processing by executing programs stored in the system memory 28 , for example, implementing the computer-implemented method for food risk traceability information classification provided in the embodiments of the present disclosure.
- the following steps are performed: building a deep learning neural networks model used for the food risk traceability information classification based on a self-learning ability of the artificial intelligence model, initializing weights and a bias of the deep learning neural networks model, and obtaining an original deep learning neural networks model; obtaining samples of the food risk traceability information, dividing the samples of the food risk traceability information according to a format of at least one preset basic traceability information factor, and obtaining factors of the food risk traceability information; converting the factors of the food risk traceability information into vectors of the food risk traceability information, according to a preset vectorization method; inputting the vectors of the food risk traceability information into the original deep learning neural networks model, and obtaining original classification vectors of current food risk traceability information; and inputting the original classification vectors into a loss function, obtaining a loss rate of the original classification vectors, determining the original classification vectors as a target classification result in response to the loss rate being within a preset range, and outputting and storing the target classification result in a non-transitory storage.
- the present disclosure provides a non-transitory computer-readable storage medium storing computer programs.
- when the computer programs are executed by a processor, the steps of the method for food risk traceability information classification provided by the above embodiments of the present disclosure are performed.
- the following steps are performed: building a deep learning neural networks model used for the food risk traceability information classification based on the self-learning ability of the artificial intelligence model, initializing weights and a bias of the deep learning neural networks model, and obtaining an original deep learning neural networks model; obtaining samples of the food risk traceability information, dividing the samples of the food risk traceability information according to a format of at least one preset basic traceability information factor, and obtaining factors of the food risk traceability information; converting the factors of the food risk traceability information into vectors of the food risk traceability information, according to a preset vectorization method; inputting the vectors of the food risk traceability information into the original deep learning neural networks model, and obtaining original classification vectors of current food risk traceability information; and inputting the original classification vectors into a loss function, obtaining a loss rate of the original classification vectors, determining the original classification vectors as a target classification result in response to the loss rate being within the preset range, and outputting and storing the target classification result.
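The steps above — initializing weights and a bias, vectorizing traceability factors, obtaining classification vectors, computing a loss rate, and updating by gradient descent until the loss falls within a preset range — can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the factor strings, dimensions, threshold, and the hash-based toy vectorization are all hypothetical stand-ins, and a single softmax layer stands in for the full deep learning neural networks model.

```python
import hashlib
import numpy as np

NUM_CLASSES, DIM = 3, 8  # illustrative: 3 risk grades, 8-dim factor vectors

def vectorize(factor: str, dim: int = DIM) -> np.ndarray:
    """Toy stand-in for the preset vectorization method: a deterministic
    pseudo-embedding seeded from the factor text."""
    seed = int.from_bytes(hashlib.md5(factor.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(dim)

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
W = rng.standard_normal((NUM_CLASSES, DIM)) * 0.1  # initialized weights
b = np.zeros(NUM_CLASSES)                          # initialized bias

# Hypothetical traceability-factor samples with risk-grade labels.
samples = [
    ("origin:farm-a;batch:7", 0),
    ("transport:cold-chain;temp:ok", 1),
    ("retail:expired-label", 2),
]
LOSS_THRESHOLD, LR = 0.05, 0.5  # preset range bound and step size (arbitrary)

loss = float("inf")
for step in range(2000):
    loss, gW, gb = 0.0, np.zeros_like(W), np.zeros_like(b)
    for text, label in samples:
        x = vectorize(text)
        p = softmax(W @ x + b)      # classification vector for this sample
        loss += -np.log(p[label])   # cross-entropy loss term
        p[label] -= 1.0             # gradient of loss w.r.t. the logits
        gW += np.outer(p, x)
        gb += p
    loss /= len(samples)
    if loss < LOSS_THRESHOLD:       # loss rate within the preset range: stop
        break
    W -= LR * gW / len(samples)     # gradient descent update
    b -= LR * gb / len(samples)
```

Once the loop exits with `loss < LOSS_THRESHOLD`, the classification vectors produced by the final `W`, `b` play the role of the target classification result that is output and stored.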
- the non-transitory computer-readable storage medium may be any combination of one or more computer-readable media.
- the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
- the computer-readable storage medium may be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
- the computer-readable storage medium includes: an electrical connection having one or more wires, a portable computer magnetic disk, a hard disk, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM) or flash memory, an optical fiber, a compact disc read only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination thereof.
- the computer-readable storage medium may be any tangible medium containing or storing a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
- the computer-readable signal medium may include a data signal transmitted in a baseband or as part of a carrier, and carries computer-readable program code.
- the propagated data signal may be in a plurality of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any appropriate combination thereof.
- the computer-readable signal medium may alternatively be any computer readable medium other than the computer readable storage medium.
- the computer-readable medium may be configured to send, propagate, or transmit a program configured to be used by or in combination with an instruction execution system, apparatus, or device.
- the computer program code configured to execute the operations of the present application may be written by using one or more programming languages or a combination thereof.
- the programming languages include an object-oriented programming language such as Java, Smalltalk and C++, and also include a conventional procedural programming language such as “C” or similar programming languages.
- the program code may be completely executed on a user computer, partially executed on a user computer, executed as an independent software package, partially executed on a user computer and partially executed on a remote computer, or completely executed on a remote computer or server.
- the remote computer may be connected to a user computer through any type of network including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet by using an Internet service provider).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Databases & Information Systems (AREA)
- Human Resources & Organizations (AREA)
- Computing Systems (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Entrepreneurship & Innovation (AREA)
- Marketing (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Development Economics (AREA)
- Game Theory and Decision Science (AREA)
- Educational Administration (AREA)
- Finance (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Accounting & Taxation (AREA)
- Neurology (AREA)
- Agronomy & Crop Science (AREA)
Abstract
Description
Claims (19)
h_t = z ⊙ h_{t-1} + (1 − z) ⊙ h̃
y_t = softmax(W · h_t)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010620361.1A CN111784159B (en) | 2020-07-01 | 2020-07-01 | Food risk traceability information grading method and device |
CN202010620361.1 | 2020-07-01 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220004859A1 US20220004859A1 (en) | 2022-01-06 |
US11568230B2 true US11568230B2 (en) | 2023-01-31 |
Family
ID=72761028
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/009,800 Active 2041-06-23 US11568230B2 (en) | 2020-07-01 | 2020-09-02 | Method and device for food risk traceability information classification, and computer readable storage medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US11568230B2 (en) |
CN (1) | CN111784159B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111784159B (en) * | 2020-07-01 | 2024-02-02 | 深圳市检验检疫科学研究院 | Food risk traceability information grading method and device |
CN112598116A (en) * | 2020-12-22 | 2021-04-02 | 王槐林 | Pet appetite evaluation method, device, equipment and storage medium |
CN112836150A (en) * | 2021-02-03 | 2021-05-25 | 捷玛计算机信息技术(上海)股份有限公司 | Identification method, system, equipment and medium for tracing code of medicine |
CN113191415B (en) * | 2021-04-26 | 2021-10-15 | 南京市产品质量监督检验院 | Food information detection method and system based on big data |
CN113378383B (en) * | 2021-06-10 | 2024-02-27 | 北京工商大学 | Food supply chain hazard prediction method and device |
CN117252346B (en) * | 2023-11-15 | 2024-02-13 | 江西珉轩智能科技有限公司 | Material traceability system and method |
CN117709986B (en) * | 2024-02-05 | 2024-05-28 | 福建农业职业技术学院 | Agricultural product production date credible tracing method and system based on deep learning |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10210860B1 (en) * | 2018-07-27 | 2019-02-19 | Deepgram, Inc. | Augmented generalized deep learning with special vocabulary |
US20200225655A1 (en) * | 2016-05-09 | 2020-07-16 | Strong Force Iot Portfolio 2016, Llc | Methods, systems, kits and apparatuses for monitoring and managing industrial settings in an industrial internet of things data collection environment |
US11393082B2 (en) * | 2018-07-26 | 2022-07-19 | Walmart Apollo, Llc | System and method for produce detection and classification |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05204885A (en) * | 1992-01-30 | 1993-08-13 | Fujitsu Ltd | Device and method for accelerating learning of neural network |
WO2005048143A1 (en) * | 2003-11-13 | 2005-05-26 | Swiss Reinsurance Company | Automated credit risk indexing system and method |
US7933847B2 (en) * | 2007-10-17 | 2011-04-26 | Microsoft Corporation | Limited-memory quasi-newton optimization algorithm for L1-regularized objectives |
CN103729459A (en) * | 2014-01-10 | 2014-04-16 | 北京邮电大学 | Method for establishing sentiment classification model |
CN103942671A (en) * | 2014-04-28 | 2014-07-23 | 深圳市检验检疫科学研究院 | Visual RFID (Radio Frequency Identification Devices) food logistics supply chain management and risk control method |
CN107341576A (en) * | 2017-07-14 | 2017-11-10 | 河北百斛环保科技有限公司 | A kind of visual air pollution of big data is traced to the source and trend estimate method |
CN108038544B (en) * | 2017-12-04 | 2020-11-13 | 华南师范大学 | Neural network deep learning method and system based on big data and deep learning |
CN108564364B (en) * | 2018-03-12 | 2019-06-14 | 重庆小富农康农业科技服务有限公司 | A kind of rural culture big data credit investigation system precisely drawn a portrait based on block chain technology and user |
CN110543884B (en) * | 2018-05-29 | 2022-04-12 | 国际关系学院 | Network attack organization tracing method based on image |
CN110796639B (en) * | 2019-09-30 | 2022-04-29 | 武汉科技大学 | Pinellia ternata quality grading method based on neural network |
CN111784159B (en) * | 2020-07-01 | 2024-02-02 | 深圳市检验检疫科学研究院 | Food risk traceability information grading method and device |
- 2020
- 2020-07-01 CN CN202010620361.1A patent/CN111784159B/en active Active
- 2020-09-02 US US17/009,800 patent/US11568230B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200225655A1 (en) * | 2016-05-09 | 2020-07-16 | Strong Force Iot Portfolio 2016, Llc | Methods, systems, kits and apparatuses for monitoring and managing industrial settings in an industrial internet of things data collection environment |
US11393082B2 (en) * | 2018-07-26 | 2022-07-19 | Walmart Apollo, Llc | System and method for produce detection and classification |
US10210860B1 (en) * | 2018-07-27 | 2019-02-19 | Deepgram, Inc. | Augmented generalized deep learning with special vocabulary |
Non-Patent Citations (3)
Title |
---|
Ali et al., "An Efficient Quality Inspection of Food Products Using Neural Network Classification", J. Intell. Syst., 2020 (Year: 2020). *
Wang et al., "An improved traceability system for food quality assurance and evaluation based on fuzzy classification and neural network", Food Control 79 (2017) 363-370 (Year: 2017). * |
Zhang et al., "Combining Convolution Neural Network and Bidirectional Gated Recurrent Unit for Sentence Semantic Classification", IEEE Access, vol. 6, 2018 (Year: 2018). * |
Also Published As
Publication number | Publication date |
---|---|
US20220004859A1 (en) | 2022-01-06 |
CN111784159A (en) | 2020-10-16 |
CN111784159B (en) | 2024-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11568230B2 (en) | Method and device for food risk traceability information classification, and computer readable storage medium | |
US10861456B2 (en) | Generating dialogue responses in end-to-end dialogue systems utilizing a context-dependent additive recurrent neural network | |
WO2021196920A1 (en) | Intelligent question answering method, apparatus and device, and computer-readable storage medium | |
WO2020107806A1 (en) | Recommendation method and device | |
Han et al. | Semi-supervised active learning for sound classification in hybrid learning environments | |
Mitchell et al. | Predicting regional climate change: living with uncertainty | |
Rao et al. | RETRACTED: A Hybrid Approach for Plant Leaf Disease Detection and Classification Using Digital Image Processing Methods | |
Lu et al. | Towards interpretable deep learning models for knowledge tracing | |
CN109543031A (en) | A kind of file classification method based on multitask confrontation study | |
CN110580341A (en) | False comment detection method and system based on semi-supervised learning model | |
US11263488B2 (en) | System and method for augmenting few-shot object classification with semantic information from multiple sources | |
Tendijck et al. | Modeling the extremes of bivariate mixture distributions with application to oceanographic data | |
Jia et al. | STCM-Net: A symmetrical one-stage network for temporal language localization in videos | |
Wu et al. | Machine translation of English speech: Comparison of multiple algorithms | |
Su et al. | Low‐Rank Deep Convolutional Neural Network for Multitask Learning | |
CN110427464A (en) | A kind of method and relevant apparatus of code vector generation | |
Wang et al. | Design of deep learning Mixed Language short Text Sentiment classification system based on CNN algorithm | |
Li et al. | Improved LSTM data analysis system for IoT-based smart classroom | |
CN109446518B (en) | Decoding method and decoder for language model | |
Yang | Student Classroom Behavior Detection based on Improved YOLOv7 | |
Li | Emotion analysis method of teaching evaluation texts based on deep learning in big data environment | |
KR102288151B1 (en) | Method, apparatus and system for providing nutritional information based on fecal image analysis | |
Nautiyal et al. | Kcc qa latent semantic representation using deep learning & hierarchical semantic cluster inferential framework | |
Wang et al. | [Retracted] Design of Sports Training Simulation System for Children Based on Improved Deep Neural Network | |
Chen | A hidden Markov optimization model for processing and recognition of English speech feature signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SHENZHEN CUSTOMS ANIMAL AND PLANT INSPECTION AND QUARANTINE TECHNOLOGY CENTER, CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CAI, YINA;BAO, XIANYU;RUAN, ZHOUXI;AND OTHERS;REEL/FRAME:053667/0221
Effective date: 20200828
Owner name: SHENZHEN CUSTOMS INFORMATION CENTER, CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CAI, YINA;BAO, XIANYU;RUAN, ZHOUXI;AND OTHERS;REEL/FRAME:053667/0221
Effective date: 20200828
Owner name: SHENZHEN ACADEMY OF INSPECTION AND QUARANTINE, CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CAI, YINA;BAO, XIANYU;RUAN, ZHOUXI;AND OTHERS;REEL/FRAME:053667/0221
Effective date: 20200828
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |