CN109034837B - Multi-code tracing anti-counterfeiting method and system - Google Patents


Publication number
CN109034837B
Authority
CN
China
Prior art keywords
product
identification code
information
character identification
poultry
Prior art date
Legal status
Active
Application number
CN201810698368.8A
Other languages
Chinese (zh)
Other versions
CN109034837A (en)
Inventor
李耀辉
任春庆
董云
张海英
李乐超
陈玉玲
丁莉
李荣佳
宋利民
程亚辉
尹成辉
赵敬震
Current Assignee
Shandong Huaxia Weikang Agriculture Animal Husbandry Technology Co., Ltd.
Original Assignee
Shandong Huaxia Weikang Agriculture Animal Husbandry Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shandong Huaxia Weikang Agriculture Animal Husbandry Technology Co., Ltd.
Priority to CN201810698368.8A
Publication of CN109034837A
Application granted
Publication of CN109034837B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/018: Certifying business or products
    • G06Q 30/0185: Product, service or business identity fraud
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1408: Methods for optical code recognition, the method being specifically adapted for the type of code
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks

Abstract

The application discloses a multi-code tracing anti-counterfeiting method and system, wherein the method comprises the following steps: scanning the graphic identification code on a product package and identifying it to obtain first identification information; displaying a query interface once the first identification information has been obtained; obtaining, through the query interface, the character identification code on the surface of the product to be queried as input by the user; and outputting the product traceability information of the product to be queried according to that character identification code. The product traceability information comprises the production information of the product to be queried over its whole production process and a pre-configured graphic identification code corresponding to the character identification code, the pre-configured graphic identification code being used to verify the authenticity of the graphic identification code on the product package. By combining the identification code on the product package with the identification code on the product surface, the whole production process of the product can be traced, the purpose of product traceability is achieved, and the probability of the product being counterfeited is reduced.

Description

Multi-code tracing anti-counterfeiting method and system
Technical Field
The application relates to the technical field of food safety, in particular to a multi-code tracing and anti-counterfeiting method and system.
Background
Food safety affects everyone's daily life and health. With rising living standards in recent years, people are no longer concerned merely with eating their fill, but with eating safely and healthily. Quality problems with meat and eggs recently exposed in the breeding industry have drawn great public attention: the opacity of meat and egg breeding and slaughtering processes has caused public misunderstanding of and mistrust in these products, and food safety has undoubtedly become the focus of concern. To enable consumers to trace the source of the products they purchase and to guarantee product quality, manufacturers have mostly adopted various active tracing and anti-counterfeiting measures.
In order to achieve tracing and anti-counterfeiting, the prior art mainly relies on anti-counterfeiting trademarks, anti-counterfeiting bar codes, laser anti-counterfeiting labels and the like to help consumers distinguish genuine products from fakes; however, as the counterfeiting techniques of illegal merchants improve, the existing anti-counterfeiting measures carry serious hidden safety risks.
The above description reflects the inventors' own technical understanding and does not necessarily constitute prior art.
Disclosure of Invention
In order to solve the above problems, the present application provides a multi-code tracing anti-counterfeiting method, which includes the following steps: scanning the graphic identification code on a product package and identifying it to obtain first identification information; displaying a query interface once the first identification information has been obtained, the query interface being used for the user to input the character identification code on the surface of any product in the product package so as to query the corresponding product traceability information; obtaining, through the query interface, the character identification code on the surface of the product to be queried as input by the user; and outputting the product traceability information of the product to be queried according to that character identification code. The product traceability information comprises the production information of the product to be queried over its whole production process and a pre-configured graphic identification code corresponding to the character identification code, the pre-configured graphic identification code being used to verify the authenticity of the graphic identification code on the product package.
In one example, after the product traceability information of the product to be queried is output according to the character identification code on the product surface, the method further comprises: scanning the pre-configured graphic identification code corresponding to the character identification code and identifying it to obtain third identification information; and verifying the authenticity of the graphic identification code on the product package by comparing the third identification information with the first identification information.
In one example, the graphic identification code takes the form of either a bar code or a two-dimensional code.
In one example, obtaining the character identification code on the surface of the product to be queried input by the user based on the query interface comprises the following steps: detecting, on the query interface, whether a photographing instruction is received, the photographing instruction being an instruction to start capturing an image of the surface of the product to be queried; capturing, when the photographing instruction is received, an image of the surface of the product to be queried, the image containing the character identification code; extracting feature data of the character identification code contained in the image based on a pre-trained convolutional neural network recognition model; and inputting the feature data into a recurrent neural network classifier, which sequentially outputs the recognition result of the character identification code according to the feature data, its own output at the previous moment, and vector data converted from the characters it recognized at the previous moment.
In one example, the forward algorithm adopted by the recurrent neural network classifier is:

$$a_h^t = \sum_{i=1}^{D} w_{ih}\,x_i^t + \sum_{h'=1}^{H} w_{h'h}\,b_{h'}^{t-1};$$

$$b_h^t = \theta(a_h^t), \qquad b_{h'}^{0} = 0;$$

$$a_k^t = \sum_{h=1}^{H} w_{hk}\,b_h^t;$$

$$y_k^t = \frac{\exp(a_k^t)}{\sum_{k'=1}^{K}\exp(a_{k'}^t)};$$

wherein D is the dimension of the input vector, H is the number of hidden-layer neurons, K is the number of output-layer neurons, and x is the feature data extracted by the convolutional neural network; $a_h^t$ is the input of hidden-layer neuron h of the recurrent neural network at the current moment, $b_h^t$ is the output of that neuron at the current moment, and $\theta(\cdot)$ is the activation function mapping $a_h^t$ to $b_h^t$; $w_{ih}$ and $w_{h'h}$ are the corresponding weight coefficients; $a_k^t$ is the input of the output layer at the current moment, $w_{hk}$ is the weight corresponding to output-layer neuron k, and $y_k^t$ is the output of output-layer neuron k at the current moment, a probability value representing the proportion of that neuron's output relative to the sum of the outputs of all output-layer neurons.
In one example, the collected image of the surface of the product to be queried is preprocessed, wherein the preprocessing comprises at least one of the following steps: graying processing, denoising processing and correcting processing.
In one example, the collected image of the surface of the product to be queried is converted to grayscale by the following algorithm: I = 0.3B + 0.59G + 0.11R; wherein I is the gray value of each pixel, B is the component of each pixel of the original image in the B channel, G is the component in the G channel, and R is the component in the R channel.
In one example, the acquired noisy image of the surface of the product to be queried is subjected to neighborhood averaging by the following algorithm to obtain a denoised image:

$$g(x,y) = \frac{1}{Q}\sum_{(i,j)\in P} f(i,j);$$

wherein P is the set of coordinates of the adjacent pixels in the neighborhood, Q is the number of adjacent pixels contained in the neighborhood, f(x, y) is the acquired noisy image, and g(x, y) is the denoised image.
In one example, the acquired image of the surface of the product to be queried is subjected to a Radon transform along each preset inclination angle by the following algorithm; the sum of the absolute gradient values of the projection integral profile corresponding to each preset inclination angle is calculated, the inclination angle whose accumulated absolute gradient value is largest is determined as the inclination angle of the original image, and the original image is corrected according to the determined inclination angle to obtain a corrected image:

$$R_\phi(x') = \int f(x'\cos\phi - y'\sin\phi,\; x'\sin\phi + y'\cos\phi)\,dy';$$

$$\phi^{*} = \arg\max_{\phi}\sum_{x'}\left|R_\phi(x'+1) - R_\phi(x')\right|;$$

wherein $\phi$ denotes a preset inclination angle, $R_\phi(\cdot)$ denotes the Radon transform in the $\phi$ direction, $\phi^{*}$ is the determined inclination angle, and f(x, y) is the acquired image in which the character identification code is tilted.
In one example, before extracting the feature data of the character identification code contained in the image of the surface of the product to be queried based on a pre-trained convolutional neural network recognition model, the method further comprises: extracting features in eight directions from the image of the surface of the product to be queried using a filter; taking the image of the surface of the product to be queried together with the filter-extracted feature images as the input of a convolutional neural network recognition model, the model being a neural network comprising two convolutional layers and one multi-convolutional layer; and determining the convolutional neural network model with the highest test recognition accuracy as the convolutional neural network recognition model used to extract the feature data of the character identification code contained in the image.
In one example, the filter is a Gabor filter that extracts features according to the formulas:

$$\psi_m(x,y) = \frac{\kappa^2}{\sigma^2}\exp\left(-\frac{\kappa^2\left(x_m'^2 + y_m'^2\right)}{2\sigma^2}\right)\left[\cos\left(\kappa\,x_m'\right) - \exp\left(-\frac{\sigma^2}{2}\right)\right];$$

$$x_m' = x\cos\theta_m + y\sin\theta_m;$$

$$y_m' = -x\sin\theta_m + y\cos\theta_m;$$

$$\theta_m = \frac{\pi(m-1)}{M}, \qquad m = 1, 2, \dots, M;$$

wherein (x, y) denotes the pixel location, M is the number of directions, $\theta_m$ denotes the m-th direction, $\kappa$ is the central frequency of the filter, and $\sigma$ denotes the spatial scale factor.
In one example, the product traceability information includes at least one of: breeding plant information, poultry house information, poultry growth information, poultry feed information, poultry vaccine information, and breeding environment information.
In one example, after setting a corresponding character identification code for each egg produced by the same batch of poultry based on the graphic identification code, the method further comprises: constructing a blockchain network, wherein the block nodes in the blockchain network comprise at least one of the following: young poultry suppliers, feed suppliers, vaccine suppliers, poultry breeding plants, egg distributors, consumers; and storing the product traceability information of each egg to the blockchain of each block node in the blockchain network based on the character identification code of that egg.
On the other hand, the application also provides a multi-code tracing anti-counterfeiting system, comprising: a consumer terminal for scanning the graphic identification code on a product package and identifying it to obtain first identification information; and a cloud server in communication with the consumer terminal, for displaying a query interface once the consumer terminal has obtained the first identification information, obtaining through the query interface the character identification code on the surface of the product to be queried as input by the user, and outputting the product traceability information of the product to be queried according to that character identification code; wherein the product traceability information comprises the production information of the product to be queried over its whole production process and a pre-configured graphic identification code corresponding to the character identification code, the pre-configured graphic identification code being used to verify the authenticity of the graphic identification code on the product package.
In one example, the system further comprises: a first coding device for generating the graphic identification code of a product package and printing it on the corresponding package; and a second coding device for generating the character identification code of each product in the package and printing it on the corresponding product surface.
In one example, the system further comprises: a collection device in communication with the cloud server to collect the product traceability information of each product; the cloud server stores the product traceability information of each product into the blockchain network based on the character identification code of that product.
In one example, the product is a poultry egg, and the system further comprises: a farmer terminal in communication with the cloud server for remotely monitoring the breeding information of at least one breeding plant.
The multi-code tracing anti-counterfeiting method provided by the application can bring the following beneficial effects:
1. The traceability information of each product is traced through the graphic identification code arranged on the product package and the character identification code arranged on the surface of each product in the package, so that product safety is guaranteed;
2. The graphic identification code on the product package is verified by comparison against the graphic identification code returned for the character identification code, so that the probability of the product being counterfeited is reduced;
3. The traceability information of each product is stored in a blockchain, which ensures the uniqueness and tamper resistance of the data.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic structural diagram of a multi-code traceability anti-counterfeiting system according to an embodiment of the present application;
FIG. 2 is a schematic view of an alternative breeding plant provided by an embodiment of the present application;
fig. 3 is a schematic view of an alternative egg package provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a query interface provided in an embodiment of the present application;
fig. 5 is a schematic flow chart of a multi-code traceability anti-counterfeiting method according to an embodiment of the present application.
Detailed Description
In order to more clearly explain the overall concept of the present application, the following detailed description is given by way of example in conjunction with the accompanying drawings.
It should be noted that the multi-code traceability anti-counterfeiting scheme provided by the application can be applied to, but is not limited to, various meat, egg, livestock and poultry products. The whole production process of a product is traced through the combination of the identification code on the product package and the identification code on the product surface; because the intrinsic relation between these two codes is known only to the product's manufacturer, the probability of the product being counterfeited is greatly reduced.
In addition, it should be noted that, as an alternative embodiment, the above-mentioned product can be any kind of poultry egg (including but not limited to goose eggs, duck eggs, chicken eggs, quail eggs, etc.); the following examples of the present application take chicken eggs as the example.
The embodiment of the application discloses a multi-code traceability anti-counterfeiting system. As shown in fig. 1, the system comprises: a cloud server 2, a farmer terminal 3, a consumer terminal 4 and at least one breeding plant (fig. 1 shows n of them, labeled 1-1, 1-2, …, 1-n), wherein each breeding plant has a plurality of poultry houses (only four per plant are shown in fig. 1: houses 1-1-1 through 1-1-4 of plant 1-1, houses 1-2-1 through 1-2-4 of plant 1-2, and houses 1-n-1 through 1-n-4 of plant 1-n). An Internet of Things terminal in communication with the cloud server 2 is deployed in each breeding plant (terminal 5-1 in plant 1-1, terminal 5-2 in plant 1-2, terminal 5-n in plant 1-n), so that the breeding information collected by the various sensors in each plant can be sent to the cloud server 2. By managing the breeding information of every plant through the cloud server, cloud breeding can be achieved.
Optionally, each breeding plant in communication with the cloud server may collect breeding information through various collection devices, which can be sensors or input devices, so as to acquire the traceability information of each product; the cloud server stores the traceability information of each egg into the blockchain network based on the character identification code on the surface of that egg. Specifically, each chick supplier, feed supplier, vaccine supplier, layer breeding plant, egg seller and consumer can serve as a node of the blockchain network. Once the network is constructed, the traceability information of each egg, covering the whole supply chain from chicks, feed and the vaccines injected into the layers, through the breeding of the layers, to the sale of the eggs to the final consumer, is stored at each node of the blockchain network based on the unique character identification code of each egg, which ensures that the data cannot be tampered with.
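For illustration, the following minimal Python sketch models such per-egg, hash-linked storage at a single node. The block structure, the names TraceBlock and add_trace_record, and the sample field values are assumptions for illustration and are not specified by the application.

```python
import hashlib
import json
import time

class TraceBlock:
    """One block holding the traceability record of a single egg (illustrative)."""
    def __init__(self, record: dict, prev_hash: str):
        self.timestamp = time.time()
        self.record = record            # keyed by the egg's unique character code
        self.prev_hash = prev_hash
        self.hash = self.compute_hash()

    def compute_hash(self) -> str:
        payload = json.dumps(
            {"t": self.timestamp, "record": self.record, "prev": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

# Hypothetical chain replicated at chick-supplier, feed-supplier, vaccine-supplier,
# layer-farm, seller and consumer nodes.
chain = [TraceBlock({"genesis": True}, prev_hash="0" * 64)]

def add_trace_record(char_code: str, info: dict) -> None:
    """Append one egg's whole-supply-chain record, keyed by its character code."""
    chain.append(TraceBlock({"char_code": char_code, **info}, chain[-1].hash))

def chain_is_untampered() -> bool:
    """Any edit to a stored record breaks the hash links, making tampering visible."""
    return all(b.prev_hash == a.hash and b.hash == b.compute_hash()
               for a, b in zip(chain, chain[1:]))

add_trace_record("WC20170122", {"farm": "1-1", "house": "1-1-3",
                                "feed_lot": "F-08", "vaccine_lot": "V-12"})
assert chain_is_untampered()
```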
Taking breeding plant 1-1 as an example, fig. 2 is a schematic diagram of an optional breeding plant provided by an embodiment of the present application. As shown in fig. 2, icons 7-1-1 through 7-1-7 are a temperature sensor, a humidity sensor, a wind speed sensor, a carbon dioxide sensor, an illumination sensor, an ammonia sensor and a PM2.5 sensor, respectively. Breeding environment information such as temperature, humidity, wind speed, carbon dioxide, illumination, ammonia and PM2.5 is collected in each breeding plant through these sensors, and the collected information is sent to the cloud server 2 through the Internet of Things terminal 5-1.
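As a sketch of how an Internet of Things terminal might report such readings, the snippet below posts one environment snapshot as JSON; the endpoint URL, field names and units are illustrative assumptions, since the application does not specify a transport protocol.

```python
import json
import time
from urllib import request

# Hypothetical endpoint: the application does not specify a transport protocol.
CLOUD_URL = "http://cloud.example.com/api/v1/farming-data"

def upload_environment_snapshot(farm_id: str, house_id: str, readings: dict) -> None:
    """Send one snapshot of barn-environment sensor readings to the cloud server."""
    payload = json.dumps({
        "farm": farm_id,
        "house": house_id,
        "ts": time.time(),
        "readings": readings,  # temperature, humidity, wind speed, CO2, lux, NH3, PM2.5
    }).encode()
    req = request.Request(CLOUD_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req, timeout=5)

upload_environment_snapshot("1-1", "1-1-2", {
    "temp_c": 23.4, "rh_pct": 61.0, "wind_ms": 0.3,
    "co2_ppm": 850, "lux": 120, "nh3_ppm": 8.5, "pm25_ugm3": 35,
})
```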
To achieve intelligent farming, each breeding plant may deploy an intelligent breeding robot, e.g., the intelligent breeding robot 6-1 of breeding plant 1-1. The intelligent breeding robot communicates with the various sensors deployed in the plant and manages the plant according to the information they collect; for example, when the illumination sensor 7-1-5 detects that the illumination in the plant is insufficient, the intelligent breeding robot 6-1 can open a window or switch on a dedicated light supplement lamp (which can be an LED bulb with a spectrum suited to livestock breeding).
The consumer terminal 4 shown in fig. 1 refers to the terminal device used by a consumer when purchasing a product, and may be, but is not limited to, various forms of mobile phone. A purchasing user scans the graphic identification code on a product package with the consumer terminal 4 (for example, the two-dimensional code 8 on the egg packing box shown in fig. 3). When the consumer terminal 4 obtains the first identification information from this scan, the cloud server in communication with the consumer terminal 4 displays a query interface through the terminal. The user then inputs, through the query interface, the character identification code on the surface of any product in the box to be queried (for example, the characters 9 on the surface of an egg in the egg packing box shown in fig. 3). Since the traceability information of every product is stored on the cloud server 2, once the cloud server 2 receives the character identification code entered through the query interface, it can look up the traceability information of the product according to that code. The traceability information includes the production information of the product over its whole production process and the pre-configured graphic identification code corresponding to the character identification code, the latter being used to verify the authenticity of the graphic identification code on the product package by comparison.
Since manually entering the character identification code on the product surface may degrade the user experience, as another alternative the query interface displayed on the consumer terminal 4 may also support an image recognition function once the consumer terminal 4 has obtained the first identification information. As shown in fig. 4, the consumer does not need to type the character identification code (for example, WC20170122); instead, the consumer taps the photographing query, the consumer terminal 4 detects the photographing instruction and turns on the camera, and the character identification code is recognized by photographing the egg surface bearing the code "WC20170122".
When a user photographs the surface of a product, the captured image may contain noise due to the surrounding environment; in addition, because shooting angles differ, the character identification code in the captured image may be tilted at some angle. Therefore, before image recognition, the acquired image of the surface of the product to be queried is first preprocessed, where the preprocessing includes at least one of: graying, denoising and correction.
When recognizing a character image, the acquired color image is first converted into a grayscale image. As an optional implementation, the conversion uses the following algorithm:
I = 0.3B + 0.59G + 0.11R;
wherein I is the gray value of each pixel, B is the component of each pixel of the original image in the B channel, G is the component in the G channel, and R is the component in the R channel.
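A minimal Python sketch of this conversion follows; it keeps the weight-to-channel assignment exactly as stated above, while the B, G, R channel order is an assumption borrowed from OpenCV's convention.

```python
import numpy as np

def to_gray(img_bgr: np.ndarray) -> np.ndarray:
    """Grayscale conversion with the weights stated above: I = 0.3B + 0.59G + 0.11R.
    Assumes OpenCV-style B, G, R channel order."""
    b = img_bgr[..., 0].astype(np.float64)
    g = img_bgr[..., 1].astype(np.float64)
    r = img_bgr[..., 2].astype(np.float64)
    return (0.3 * b + 0.59 * g + 0.11 * r).astype(np.uint8)
```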
Further, in order to improve the signal-to-noise ratio of the image, the acquired noisy image of the surface of the product to be queried is subjected to neighborhood averaging by the following algorithm to obtain a denoised image:

$$g(x,y) = \frac{1}{Q}\sum_{(i,j)\in P} f(i,j);$$

wherein P is the set of coordinates of the adjacent pixels in the neighborhood, Q is the number of adjacent pixels contained in the neighborhood, f(x, y) is the acquired noisy image, and g(x, y) is the denoised image.
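A neighborhood average is simply a mean filter, so it can be sketched with SciPy's uniform_filter as below; the window size (and hence Q = size**2) is an illustrative choice.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def denoise_neighborhood_average(gray: np.ndarray, size: int = 3) -> np.ndarray:
    """Replace each pixel by the mean of its size-by-size neighborhood,
    i.e. g(x, y) = (1/Q) * sum of f over the neighborhood, with Q = size**2."""
    return uniform_filter(gray.astype(np.float64), size=size).astype(np.uint8)
```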
in addition, when the acquired image of the surface of the product to be queried is corrected, the inclination angle of the character identification code in the image needs to be determined, and then the acquired image is corrected according to the determined inclination angle. As an optional implementation manner, the acquired image may be subjected to Radon transformation along each preset inclination angle through the following algorithm, the sum of absolute gradient values of the projection integral map corresponding to each preset inclination angle is calculated, and the inclination angle with the maximum accumulated value of the absolute gradient values is determined as the inclination angle of the original image:
Rφ(x')=∫f(x'cosφ-y'sinφ,x'sinφ-y'cosφ)dy';
Figure BDA0001714200550000092
wherein phi represents a predetermined tilt angle, Rφ() Indicating that the Radon transform is performed in the phi direction and f (x, y) is an image in which the acquired character identification code is tilted.
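The following sketch realizes this skew estimation with scikit-image's radon transform and corrects the image with SciPy's rotate; the candidate angle range, step and rotation sign convention are assumptions that would need tuning to the actual imaging setup.

```python
import numpy as np
from scipy.ndimage import rotate
from skimage.transform import radon

def estimate_skew(gray: np.ndarray, angles: np.ndarray = None) -> float:
    """Return the candidate angle whose projection profile has the largest
    sum of absolute gradients, per the criterion above."""
    if angles is None:
        angles = np.arange(-45, 46)                 # candidate tilt angles (degrees)
    sinogram = radon(gray, theta=angles, circle=False)  # one column per angle
    grad_energy = np.abs(np.diff(sinogram, axis=0)).sum(axis=0)
    return float(angles[int(np.argmax(grad_energy))])

def deskew(gray: np.ndarray) -> np.ndarray:
    # The rotation sign depends on the image coordinate convention and may
    # need flipping for a given camera setup.
    return rotate(gray, -estimate_skew(gray), reshape=False, mode="nearest")
```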
In order to achieve fast recognition without segmenting the characters in the character image, a convolutional neural network (CNN) and a recurrent neural network (RNN) are used to recognize the character sequence of the image containing the whole character identification code continuously. Specifically, feature data of the character identification code contained in the image of the surface of the product to be queried is first extracted by a pre-trained convolutional neural network recognition model; the feature data is then input into a recurrent neural network classifier, which sequentially outputs the recognition result of the character identification code according to the feature data, its own output at the previous moment, and vector data converted from the characters it recognized at the previous moment.
As an alternative implementation, the forward algorithm adopted by the recurrent neural network classifier may be:

$$a_h^t = \sum_{i=1}^{D} w_{ih}\,x_i^t + \sum_{h'=1}^{H} w_{h'h}\,b_{h'}^{t-1};$$

$$b_h^t = \theta(a_h^t), \qquad b_{h'}^{0} = 0;$$

$$a_k^t = \sum_{h=1}^{H} w_{hk}\,b_h^t;$$

$$y_k^t = \frac{\exp(a_k^t)}{\sum_{k'=1}^{K}\exp(a_{k'}^t)};$$

where D is the dimension of the input vector, H is the number of hidden-layer neurons, K is the number of output-layer neurons, and x is the feature data extracted by the convolutional neural network; $a_h^t$ is the input of hidden-layer neuron h of the recurrent neural network at the current moment, $b_h^t$ is the output of that neuron at the current moment, and $\theta(\cdot)$ is the activation function mapping $a_h^t$ to $b_h^t$; $w_{ih}$ and $w_{h'h}$ are the corresponding weight coefficients. During the forward pass, $w_{ih}$ and $w_{h'h}$ are shared across the time sequence: as the recurrent neural network propagates signals forward, $w_{ih}$ and $w_{h'h}$ take the same values at every moment. This sharing reduces the complexity of the model parameters and avoids the overfitting that a linear growth of model complexity would cause. $a_k^t$ is the input of the output layer at the current moment; $w_{hk}$ is the weight corresponding to output-layer neuron k; $y_k^t$ is the output of output-layer neuron k at the current moment, a probability value representing the proportion of that neuron's output relative to the sum of the outputs of all output-layer neurons.
From the above formulas it can be seen that the input of a hidden-layer neuron of the recurrent neural network used in the present application includes both the training sample features extracted by the CNN and the hidden-layer output of the recurrent neural network at the previous moment. Therefore, when predicting the character at the current moment, the recurrent neural network relies not only on the features of the image but also on the features output at the previous moment (a language-model effect), which improves recognition efficiency and accuracy.
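A NumPy sketch of this forward pass is given below. The layer sizes, the choice of tanh for θ and the random weights are illustrative assumptions; in the application these would come from training.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, K, T = 64, 128, 36, 10            # input dim, hidden units, classes, steps

W_ih = rng.normal(scale=0.01, size=(D, H))   # w_ih, shared across all time steps
W_hh = rng.normal(scale=0.01, size=(H, H))   # w_h'h, also shared across time
W_hk = rng.normal(scale=0.01, size=(H, K))   # w_hk, output-layer weights

def rnn_forward(x_seq: np.ndarray) -> np.ndarray:
    """x_seq: (T, D) CNN feature vectors; returns (T, K) softmax outputs y_k^t."""
    b_prev = np.zeros(H)                     # b^0 = 0, as in the formulas above
    ys = []
    for x_t in x_seq:
        a_h = x_t @ W_ih + b_prev @ W_hh     # a_h^t
        b_h = np.tanh(a_h)                   # b_h^t = theta(a_h^t); tanh chosen here
        a_k = b_h @ W_hk                     # a_k^t
        e = np.exp(a_k - a_k.max())
        ys.append(e / e.sum())               # y_k^t: softmax over output neurons
        b_prev = b_h
    return np.stack(ys)

probs = rnn_forward(rng.normal(size=(T, D)))   # per-step character probabilities
```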
It should be noted that, before the feature data of the character identification code contained in the image of the surface of the product to be queried can be extracted by a pre-trained convolutional neural network recognition model, the model must be trained. To improve its accuracy, the model may be trained as follows: Gabor features in eight directions are extracted from the image of the surface of the product to be queried by a Gabor filter, and the image together with the Gabor feature images is used as the input of the convolutional neural network recognition model; finally, the convolutional neural network model with the highest test recognition accuracy is selected as the recognition model used to extract the feature data of the character identification code. It is worth noting that the convolutional neural network recognition model adopted in the application is a neural network comprising two convolutional layers and one multi-convolutional layer.
As an alternative implementation, the formulas of the Gabor filter adopted in the present application are:

$$\psi_m(x,y) = \frac{\kappa^2}{\sigma^2}\exp\left(-\frac{\kappa^2\left(x_m'^2 + y_m'^2\right)}{2\sigma^2}\right)\left[\cos\left(\kappa\,x_m'\right) - \exp\left(-\frac{\sigma^2}{2}\right)\right];$$

$$x_m' = x\cos\theta_m + y\sin\theta_m;$$

$$y_m' = -x\sin\theta_m + y\cos\theta_m;$$

$$\theta_m = \frac{\pi(m-1)}{M}, \qquad m = 1, 2, \dots, M;$$

wherein (x, y) denotes the pixel location, M is the number of directions, $\theta_m$ denotes the m-th direction, $\kappa$ is the central frequency of the filter, and $\sigma$ denotes the spatial scale factor.
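The sketch below builds one kernel per direction from the formulas above and filters the grayscale image with each kernel; the kernel radius and the σ and κ values are illustrative assumptions.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(m: int, M: int = 8, sigma: float = 2 * np.pi,
                 kappa: float = np.pi / 2, half: int = 8) -> np.ndarray:
    """Direction-m kernel of the formulas above (1-indexed m); sigma, kappa
    and the kernel radius are illustrative values, not given by the application."""
    theta = np.pi * (m - 1) / M
    grid = np.mgrid[-half:half + 1, -half:half + 1].astype(np.float64)
    y, x = grid[0], grid[1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # x'_m
    yr = -x * np.sin(theta) + y * np.cos(theta)    # y'_m
    envelope = (kappa**2 / sigma**2) * np.exp(
        -kappa**2 * (xr**2 + yr**2) / (2 * sigma**2))
    return envelope * (np.cos(kappa * xr) - np.exp(-sigma**2 / 2))

def gabor_features(gray: np.ndarray, M: int = 8) -> list:
    """The M direction-filtered images fed to the CNN alongside the original."""
    return [convolve2d(gray, gabor_kernel(m, M), mode="same", boundary="symm")
            for m in range(1, M + 1)]
```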
As shown in fig. 2, the system may further include a first coding device 10 and a second coding device 11, so that the graphic identification code of a product package is generated by the first coding device 10 and printed on the corresponding package, and the character identification code of each product in the package is generated by the second coding device 11 and printed on the corresponding product surface.
Optionally, taking eggs as an example, the graphic identification codes (e.g., two-dimensional codes) for the same batch of chickens are identical, while different character identification codes are set on the eggs produced by that batch; that is, the graphic identification code of one batch corresponds to a plurality of character identification codes.
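This one-to-many relation can be modeled minimally as below; the code values shown are made up for illustration.

```python
# One graphic (two-dimensional) code per batch of layers, many character codes
# per batch: a minimal in-memory model of that one-to-many relation.
batch_codes: dict[str, list[str]] = {}

def register_egg(batch_qr: str, char_code: str) -> None:
    batch_codes.setdefault(batch_qr, []).append(char_code)

register_egg("QR-BATCH-2018-06", "WC20170122")    # made-up code values
register_egg("QR-BATCH-2018-06", "WC20170123")
assert len(batch_codes["QR-BATCH-2018-06"]) == 2  # same QR, distinct character codes
```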
As an alternative embodiment, the system may further include: and the farmer terminal 3 is communicated with the cloud server and is used for remotely monitoring the breeding information of at least one breeding plant. Optionally, the farmer terminal includes, but is not limited to, any one of the following: computers, cell phones, tablet computers, notebook computers, and the like.
In addition, it should be noted that the cloud server 2 communicates with the Internet of Things terminals of the breeding plants (e.g., terminal 5-1 of plant 1-1, terminal 5-2 of plant 1-2 and terminal 5-n of plant 1-n) to provide cloud breeding management. The growth information of the chickens in each henhouse and the breeding environment data of each henhouse are collected by the various sensors arranged in every henhouse of every breeding plant and uploaded to the cloud server 2 by the Internet of Things terminal deployed in each plant, so that a large amount of cloud breeding data is stored on the cloud server 2. By analyzing these data, the henhouses of every breeding plant can be managed scientifically, and data analysis can be performed on the growth, environment, feeding, vaccine injection and egg-laying information of each batch of chickens.
In an optional implementation, breeding experts can also monitor the breeding video data of each breeding plant through the cloud server 2 to observe the farmers' whole breeding process, visually analyze the breeding data and provide breeding suggestions to the farmers, thereby improving breeding efficiency.
Optionally, the cloud server 2 may employ artificial intelligence to provide an intelligent cloud breeding management scheme through machine learning, including but not limited to any of the following: (1) predicting the breeding data of the next period from collected historical breeding data (such as feed data, growth conditions, and environmental data like temperature and humidity), preferably with differentiated adjustments for different regions and climates; (2) predicting epidemic situations from collected expert diagnoses of the breeding plants and the questions asked by farmers, so as to remind farmers to take timely precautions; (3) intelligently analyzing and predicting the most suitable breeding mode or breeding data by collecting large amounts of breeding data and market data from the Internet.
As shown in fig. 1, the breeding data of each breeding plant is uploaded to the cloud server through the Internet of Things terminal, so that the breeding data of every plant is managed through the cloud server and cloud breeding management is realized. Within such a cloud breeding management system, in order to trace the production process of each breeding product, the application provides a multi-code tracing anti-counterfeiting method that can be applied to, but is not limited to, the multi-code tracing anti-counterfeiting system shown in fig. 1. As shown in fig. 5, the method includes the following steps:
step S501, based on the graph identification code on the scanned product package, first identification information is obtained through identification.
Specifically, the graphic identification code on the product package may be, but is not limited to, a bar code or a two-dimensional code, and the first identification information may be information characterizing the source of the product. The product can be any product requiring quality tracing, including but not limited to various meat, egg and livestock products. Taking eggs produced by poultry (e.g., layers) as an example, the first identification information may be a poultry house code (e.g., a henhouse code) characterizing which house of which breeding plant the eggs come from.
Optionally, the poultry may be any of chickens, ducks, geese, quails, etc., and the poultry eggs may correspondingly be chicken eggs, duck eggs, goose eggs or quail eggs. Taking chicken eggs as an example, after a consumer scans the graphic identification code (e.g., a two-dimensional code) on an egg package with a terminal device such as a mobile phone, the consumer can view the breeding area and scale of the breeding plant and retrace the production process of the product.
Step S502, a query interface is displayed once the first identification information has been obtained; the query interface is used for the user to input the character identification code on the surface of any product in the product package so as to query the corresponding product traceability information.
Specifically, after the identification information of the product source is obtained through the identification in step S501, a query interface may be output, where the query interface is used for a user to further query the traceability information of any product in the product package, so that the user may input a character identification code on the surface of a certain product to query the traceability information corresponding to the character identification code.
Step S503, based on the query interface, the character identification code on the surface of the product to be queried, as input by the user, is obtained.
Specifically, the user can either manually enter into the query interface the character identification code on the surface of any product in the product package, or click the 'click shooting query' button displayed on the query interface, whereupon the character identification code on the product surface is recognized automatically by photographing the image of the product surface containing the code and applying image recognition, so that the corresponding traceability information can be queried according to the character identification code.
Step S504, the product traceability information of the product to be queried is output according to the character identification code on its surface; the product traceability information comprises the production information of the product to be queried over its whole production process and a pre-configured graphic identification code corresponding to the character identification code, the pre-configured graphic identification code being used to verify the authenticity of the graphic identification code on the product package.
Specifically, after the user enters, through a terminal device such as a mobile phone, the character identification code on the surface of a product in the package into the query interface, the cloud server can return the corresponding query result according to the received character identification code and display it through the query interface on the terminal device. It is worth noting that the query result corresponding to the character identification code includes not only the production information of the product over its whole production process, but also a graphic identification code pre-configured by the manufacturer and corresponding to the character identification code, so that the authenticity of the graphic identification code on the product package can be verified against the graphic identification code corresponding to the character identification code.
Optionally, still taking eggs as an example, after the consumer inputs the character identification code on an egg through a terminal device such as a mobile phone, the consumer can view data such as the growth, environment, temperature and humidity, feeding and daily monitoring information of the laying hens that produced the egg. Preferably, by scanning the graphic identification code corresponding to the character identification code on the egg in the query result, the authenticity of the graphic identification code on the egg packaging box can be further verified to prevent the purchase of counterfeit products.
Step S505, the pre-configured graphic identification code corresponding to the character identification code is scanned and identified to obtain third identification information.
Specifically, the user may scan, with a terminal device such as a mobile phone, the graphic identification code corresponding to the character identification code shown on the query interface, so as to obtain the third identification information.
Step S506, the authenticity of the graphic identification code on the product package is verified by comparing the third identification information with the first identification information.
Specifically, the third identification information is compared with the first identification information: if they are consistent, the graphic identification code on the product package has not been forged; if they are not consistent, the graphic identification code on the product package is forged.
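The check itself reduces to an equality comparison of the two decoded identifications, as in the sketch below; the house-code strings are made up for illustration.

```python
def verify_package_code(first_id: str, third_id: str) -> bool:
    """first_id: decoded from the graphic code on the package;
    third_id: decoded from the graphic code returned by the query."""
    return first_id == third_id

# Consistent -> the package's graphic code is genuine; mismatch -> forged.
assert verify_package_code("HOUSE-1-1-3", "HOUSE-1-1-3")      # made-up codes
assert not verify_package_code("HOUSE-1-1-3", "HOUSE-9-9-9")
```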
As an optional embodiment, when the product to be queried is a poultry egg, before the product traceability information of the product to be queried is output according to the character identification code on its surface, the method further includes: acquiring the breeding plant information and poultry house information of the same batch of poultry; generating the corresponding graphic identification code according to that breeding plant information and poultry house information; and setting a corresponding character identification code for each egg produced by that batch of poultry based on the graphic identification code, wherein each egg produced by the same batch of poultry has a unique character identification code. Optionally, the poultry may be layers, the poultry eggs may be chicken eggs, and the poultry house may be a henhouse.
Optionally, the product tracing information includes at least one of: farm information, poultry house information, poultry growth information, poultry feed information, poultry vaccine information, and breeding environment information.
In order to improve data security, as an optional embodiment, after a corresponding character identification code is set for each egg produced by the same batch of poultry based on the graphic identification code, the method may further include the following steps: constructing a blockchain network, wherein the block nodes in the blockchain network comprise at least one of the following: young poultry (e.g., chick) suppliers, feed suppliers, vaccine suppliers, poultry breeding plants, egg sellers, consumers; and storing the product traceability information of each egg to the blockchain of each block node in the blockchain network based on the character identification code of that egg.
Storing the traceability information of each product in the blockchain ensures that the data cannot be tampered with.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A multi-code tracing anti-counterfeiting method, characterized by comprising the following steps:
scanning the graphic identification code on a product package and identifying it to obtain first identification information;
displaying a query interface under the condition that the first identification information is obtained; the query interface is used for the user to input the character identification code on the surface of any product in the product package so as to query the corresponding product traceability information;
obtaining, based on the query interface, the character identification code on the surface of the product to be queried as input by the user;
outputting the product traceability information of the product to be queried according to the character identification code on the surface of the product to be queried; wherein the product traceability information comprises: production information of the product to be queried in the whole production process, and a pre-configured graphic identification code corresponding to the character identification code, wherein the pre-configured graphic identification code corresponding to the character identification code is used for verifying the authenticity of the graphic identification code on the product package;
after the product traceability information of the product to be queried is output according to the character identification code on the surface of the product to be queried, the method further comprises the following steps:
scanning the pre-configured graphic identification code corresponding to the character identification code and identifying it to obtain third identification information;
verifying the authenticity of the graphic identification code on the product package by comparing the third identification information with the first identification information;
the product to be queried is a poultry egg, and before the product traceability information of the product to be queried is output according to the character identification code on the surface of the product to be queried, the method further comprises the following steps:
acquiring the breeding plant information and poultry house information of the same batch of poultry;
generating the corresponding graphic identification code according to the breeding plant information and poultry house information of the same batch of poultry;
setting a corresponding character identification code for each egg produced by the same batch of poultry based on the graphic identification code; wherein each egg produced by the same batch of poultry has a unique character identification code.
2. The method of claim 1, wherein the product traceability information comprises at least one of: breeding plant information, poultry house information, poultry growth information, poultry feed information, poultry vaccine information and breeding environment information; wherein, after setting a corresponding character identification code for each egg produced by the same batch of poultry based on the graphic identification code, the method further comprises:
constructing a blockchain network, wherein the block nodes in the blockchain network comprise at least one of the following: young poultry suppliers, feed suppliers, vaccine suppliers, poultry breeding plants, egg distributors, consumers;
and storing the product traceability information of each egg to the blockchain of each block node in the blockchain network based on the character identification code of each egg.
3. The method of claim 1, wherein obtaining the character identification code on the surface of the product to be queried input by the user based on the query interface comprises:
detecting, based on the query interface, whether a photographing instruction is received, wherein the photographing instruction is an instruction to start capturing an image of the surface of the product to be queried;
capturing, under the condition that the photographing instruction is received, an image of the surface of the product to be queried, wherein the image contains the character identification code;
extracting feature data of the character identification code contained in the image of the surface of the product to be queried based on a pre-trained convolutional neural network recognition model;
and inputting the feature data into a recurrent neural network classifier, which sequentially outputs the recognition result of the character identification code according to the feature data, its own output at the previous moment, and vector data converted from the characters it recognized at the previous moment.
4. The method of claim 3, wherein the forward algorithm adopted by the recurrent neural network classifier is:

$$a_h^t = \sum_{i=1}^{D} w_{ih}\,x_i^t + \sum_{h'=1}^{H} w_{h'h}\,b_{h'}^{t-1};$$

$$b_h^t = \theta(a_h^t), \qquad b_{h'}^{0} = 0;$$

$$a_k^t = \sum_{h=1}^{H} w_{hk}\,b_h^t;$$

$$y_k^t = \frac{\exp(a_k^t)}{\sum_{k'=1}^{K}\exp(a_{k'}^t)};$$

wherein D is the dimension of the input vector, H is the number of hidden-layer neurons, K is the number of output-layer neurons, and x is the feature data extracted by the convolutional neural network; $a_h^t$ is the input of hidden-layer neuron h of the recurrent neural network at the current moment, $b_h^t$ is the output of that neuron at the current moment, and $\theta(\cdot)$ is the activation function mapping $a_h^t$ to $b_h^t$; $w_{ih}$ and $w_{h'h}$ are the corresponding weight coefficients; $a_k^t$ is the input of the output layer at the current moment, $w_{hk}$ is the weight corresponding to output-layer neuron k, and $y_k^t$ is the output of output-layer neuron k at the current moment, a probability value representing the proportion of that neuron's output relative to the sum of the outputs of all output-layer neurons.
5. The method of claim 3, wherein the acquired image of the surface of the product to be queried is preprocessed, wherein the preprocessing comprises at least one of: graying processing, denoising processing and correcting processing.
6. The method according to claim 5, characterized in that the acquired image of the surface of the product to be queried is grayed by the following algorithm:
I=0.3B+0.59G+0.11R;
wherein, I is a gray scale value of each pixel, B is a component of each pixel in the original image in a B channel, G is a component of each pixel in the original image in a G channel, and R is a component of each pixel in the original image in an R channel.
7. The method as claimed in claim 5, wherein the acquired noisy image of the surface of the product to be queried is subjected to neighborhood averaging by the following algorithm to obtain a denoised image:

$$g(x,y) = \frac{1}{Q}\sum_{(i,j)\in P} f(i,j);$$

wherein P is the set of coordinates of the adjacent pixels in the neighborhood, Q is the number of adjacent pixels contained in the neighborhood, f(x, y) is the acquired noisy image, and g(x, y) is the denoised image.
8. The method of claim 5, wherein the acquired image of the surface of the product to be queried is subjected to a Radon transform along each preset inclination angle by the following algorithm; the sum of the absolute gradient values of the projection profile corresponding to each preset inclination angle is calculated, the inclination angle with the largest accumulated absolute-gradient value is determined to be the inclination angle of the original image, and the original image is corrected according to the determined angle to obtain a corrected image:

R_φ(x') = ∫ f(x'cosφ − y'sinφ, x'sinφ + y'cosφ) dy';

φ* = argmax_φ Σ_{x'} |R_φ(x'+1) − R_φ(x')|;

wherein φ denotes a preset inclination angle, R_φ(·) denotes the Radon transform in the φ direction, φ* is the detected inclination angle, and f(x, y) is the acquired image in which the character identification code is tilted.
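A sketch of this skew-detection step. Rotating the image and summing its rows stands in for the Radon projection; the ±15° search range and 0.5° step are assumptions:

```python
import numpy as np
from scipy.ndimage import rotate

def estimate_skew(image: np.ndarray, angles=np.arange(-15.0, 15.5, 0.5)) -> float:
    """Return the angle whose projection profile has the largest
    sum of absolute gradients, as described in claim 8."""
    best_angle, best_score = 0.0, -np.inf
    for phi in angles:
        rotated = rotate(image, phi, reshape=False, order=1)
        projection = rotated.sum(axis=1)           # R_phi: row-wise integral
        score = np.abs(np.diff(projection)).sum()  # sum of |gradient|
        if score > best_score:
            best_angle, best_score = phi, score
    return best_angle
```

The image is then counter-rotated by the returned angle to obtain the corrected image.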
9. The method of claim 3, wherein before extracting the feature data of the character identification code contained in the image of the surface of the product to be queried based on a pre-trained convolutional neural network recognition model, the method further comprises:

extracting features of the image of the surface of the product to be queried in eight directions using a filter;

taking the image of the surface of the product to be queried together with the filter-extracted feature images as the input of the convolutional neural network recognition model, the model being a neural network comprising two convolutional layers and one multi-convolutional layer (sketched below);

and determining the convolutional neural network model with the highest recognition accuracy on the test set as the convolutional neural network recognition model used to extract the feature data of the character identification code contained in the image of the surface of the product to be queried.
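A hedged PyTorch sketch of such a feature extractor: the grayscale image plus its eight direction-filtered versions are stacked into a nine-channel input. The claim's "multi-convolutional layer" is not defined in the text, so a third convolutional stage with global pooling is assumed here.

```python
import torch
import torch.nn as nn

class CharFeatureCNN(nn.Module):
    """Sketch: two conv layers plus an assumed third conv stage; the input is
    the original image stacked with the eight direction-feature images."""
    def __init__(self, n_directions: int = 8, feat_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1 + n_directions, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, feat_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                 # global average pooling
        )

    def forward(self, x):                            # x: (N, 9, H, W)
        return self.features(x).flatten(1)           # (N, feat_dim) feature vectors

model = CharFeatureCNN()
dummy = torch.randn(2, 9, 32, 32)                    # two 9-channel sample inputs
print(model(dummy).shape)                            # torch.Size([2, 64])
```

These feature vectors are what the recurrent classifier of claims 3 and 4 consumes, one per time step.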
10. The method of claim 9, wherein the filter is a Gabor filter that extracts features according to the formulas:

θ_m = (m − 1)π/M, m = 1, …, M;

x' = x cos θ_m + y sin θ_m;

y' = −x sin θ_m + y cos θ_m;

G_m(x, y) = exp(−(x'^2 + y'^2)/(2σ^2))·cos(2πx'/σ);

wherein (x, y) denotes the pixel location, M is the number of directions, θ_m represents the m-th direction, and σ is the spatial scale factor.
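A sketch of an M-direction Gabor bank matching the reconstruction above; the 15×15 kernel size, σ = 3, and the σ-tied frequency term are assumptions, since the original kernel images were lost in the text extraction:

```python
import numpy as np

def gabor_bank(size: int = 15, sigma: float = 3.0, M: int = 8):
    """Real Gabor kernels in M directions (the formulation sketched above)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    kernels = []
    for m in range(M):
        theta = m * np.pi / M                       # theta_m, 0-based here
        xr = x * np.cos(theta) + y * np.sin(theta)  # rotated coordinates
        yr = -x * np.sin(theta) + y * np.cos(theta)
        g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / sigma)
        kernels.append(g - g.mean())                # zero-mean to suppress DC response
    return kernels

# Each kernel is convolved with the character image to obtain one of the
# eight direction-feature images fed to the CNN of claim 9.
```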
11. A multi-code tracing anti-counterfeiting system, characterized by comprising:
a consumer terminal, configured to scan the graphic identification code on the product package and recognize it to obtain first identification information;

a cloud server, in communication with the consumer terminal, configured to display a query interface when the consumer terminal recognizes the first identification information, to obtain, via the query interface, the character identification code on the surface of the product to be queried as input by a user, and to output the product traceability information of the product to be queried according to that character identification code;

wherein the product traceability information comprises: the production information of the product to be queried over the whole production process, and a pre-configured graphic identification code corresponding to the character identification code, the pre-configured graphic identification code being used to verify the authenticity of the graphic identification code on the product package;
wherein the product to be queried is a poultry egg, and before the product traceability information of the product to be queried is output according to the character identification code on the surface of the product, the system is further configured to:

acquire the breeding farm information and poultry house information of the same batch of poultry;

generate a corresponding graphic identification code according to the breeding farm information and poultry house information of the same batch of poultry;

and set a corresponding character identification code for each egg produced by the same batch of poultry based on the graphic identification code, wherein each egg produced by the same batch of poultry has a unique character identification code.
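For illustration, one way such unique per-egg character codes could be derived from the batch-level graphic code; the SHA-256 derivation, the 12-character length, and the batch-code format are assumptions, not the patent's prescribed scheme:

```python
import hashlib

def egg_character_code(batch_code: str, egg_index: int, length: int = 12) -> str:
    """Derive a unique, hard-to-guess character code for one egg from the
    batch-level graphic identification code and a running egg index."""
    digest = hashlib.sha256(f"{batch_code}:{egg_index}".encode()).hexdigest()
    return digest[:length].upper()

# Example: codes for the first three eggs of one (hypothetical) batch.
batch = "FARM01-HOUSE07-20180629"
print([egg_character_code(batch, i) for i in range(3)])
```

Because the derivation depends on both the batch code and the egg index, no two eggs of the batch share a character code, while the cloud server can still map any code back to its batch.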
12. The system of claim 11, further comprising:

a first coding device, configured to generate the graphic identification code of a product package and print the graphic identification code on the corresponding product package;

and a second coding device, configured to generate the character identification code of each product in the product package and print the character identification code of each product on the corresponding product surface.
13. The system of claim 11, further comprising:

an acquisition device, in communication with the cloud server, configured to acquire the product traceability information of each product; wherein the cloud server stores the product traceability information of each product into the blockchain network based on the character identification code of each product.
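The claim names a blockchain network without prescribing one; as a sketch of the storage pattern, a minimal in-memory hash-chained ledger keyed by the character identification code:

```python
import hashlib, json, time

class TraceLedger:
    """Minimal hash-chained ledger: one block per traceability record,
    keyed by the product's character identification code."""
    def __init__(self):
        self.blocks = []

    def add_record(self, char_code: str, info: dict) -> str:
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        body = {"code": char_code, "info": info, "ts": time.time(), "prev": prev}
        h = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.blocks.append({**body, "hash": h})      # append-only: tampering breaks the chain
        return h

    def lookup(self, char_code: str):
        return [b for b in self.blocks if b["code"] == char_code]

ledger = TraceLedger()
ledger.add_record("A3F9C21B7D04", {"farm": "FARM01", "house": 7, "batch": "20180629"})
print(ledger.lookup("A3F9C21B7D04"))
```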
CN201810698368.8A 2018-06-29 2018-06-29 Multi-code tracing anti-counterfeiting method and system Active CN109034837B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810698368.8A CN109034837B (en) 2018-06-29 2018-06-29 Multi-code tracing anti-counterfeiting method and system

Publications (2)

Publication Number Publication Date
CN109034837A CN109034837A (en) 2018-12-18
CN109034837B true CN109034837B (en) 2020-12-29

Family

ID=65522014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810698368.8A Active CN109034837B (en) 2018-06-29 2018-06-29 Multi-code tracing anti-counterfeiting method and system

Country Status (1)

Country Link
CN (1) CN109034837B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369261A (en) * 2018-12-24 2020-07-03 阿里巴巴集团控股有限公司 Product tracing method and system and product tracing information processing method
CN109784951A (en) * 2019-01-29 2019-05-21 浙江甲骨文超级码科技股份有限公司 A kind of anti-fake traceability system of product whole process based on block chain technology
CN110163629A (en) * 2019-04-09 2019-08-23 南京新立讯科技股份有限公司 A kind of commodity code of tracing to the source generates and querying method and device
WO2020223905A1 (en) * 2019-05-07 2020-11-12 林晖 Method for tracking product history
CN110490305B (en) * 2019-08-22 2021-06-04 腾讯科技(深圳)有限公司 Machine learning model processing method based on block chain network and node
CN110782265A (en) * 2019-11-12 2020-02-11 北京海益同展信息科技有限公司 Information processing method, device, system and computer readable storage medium
CN112288442A (en) * 2020-10-15 2021-01-29 江苏图码信息科技有限公司 Associated customized image code anti-counterfeiting tracing system and application component
CN112016535A (en) * 2020-10-26 2020-12-01 成都合能创越软件有限公司 Vehicle-mounted garbage traceability method and system based on edge calculation and block chain
CN112488109B (en) * 2020-12-10 2024-03-29 深圳市云辉牧联科技有限公司 Method and device for identifying livestock and poultry identification codes and computer readable storage medium
CN114241248B (en) * 2022-02-24 2022-07-01 北京市农林科学院信息技术研究中心 River crab origin tracing method and system
CN116132107B (en) * 2022-12-16 2024-04-12 苏州可米可酷食品有限公司 Full life cycle quality data traceability management system based on data cloud processing product
CN116911883B (en) * 2023-09-14 2023-12-19 新立讯科技股份有限公司 Agricultural product anti-counterfeiting tracing method and cloud platform based on AI (advanced technology) authentication technology and tracing quantification

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103996239B (en) * 2014-06-13 2016-08-24 广州广电运通金融电子股份有限公司 A kind of bill positioning identifying method merged based on multi thread and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105654135A (en) * 2015-12-30 2016-06-08 成都数联铭品科技有限公司 Image character sequence recognition system based on recurrent neural network
CN105678292A (en) * 2015-12-30 2016-06-15 成都数联铭品科技有限公司 Complex optical text sequence identification system based on convolution and recurrent neural network
CN107330581A (en) * 2017-06-08 2017-11-07 上海交通大学 Agricultural product quality information system based on block chain

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"深度|区块链在天猫国际商品溯源中的应用";安和林(/users/mescnocly2te);《https://yq.aliyun.com/articles/348787?utm_content=m_40006》;20180111;正文第5页-第10页 *
安和林(/users/mescnocly2te)."深度|区块链在天猫国际商品溯源中的应用".《https://yq.aliyun.com/articles/348787?utm_content=m_40006》.2018, *

Also Published As

Publication number Publication date
CN109034837A (en) 2018-12-18

Similar Documents

Publication Publication Date Title
CN109034837B (en) Multi-code tracing anti-counterfeiting method and system
Font et al. Vineyard yield estimation based on the analysis of high resolution images obtained with artificial illumination at night
Diago et al. Grapevine yield and leaf area estimation using supervised classification methodology on RGB images taken under field conditions
CN107667903A (en) Livestock-raising live body Avoirdupois monitoring method based on Internet of Things
Mundada et al. Detection and classification of pests in greenhouse using image processing
Zhen et al. Impact of tree-oriented growth order in marker-controlled region growing for individual tree crown delineation using airborne laser scanner (ALS) data
Fernández et al. Multisensory system for fruit harvesting robots. Experimental testing in natural scenarios and with different kinds of crops
Teixidó et al. Definition of linear color models in the RGB vector color space to detect red peaches in orchard images taken under natural illumination
Fernández et al. Combination of RGB and multispectral imagery for discrimination of cabernet sauvignon grapevine elements
Arulmozhi et al. The application of cameras in precision pig farming: An overview for swine-keeping professionals
US20230071265A1 (en) Quantifying plant infestation by estimating the number of biological objects on leaves, by convolutional neural networks that use training images obtained by a semi-supervised approach
Roldán-Serrato et al. Automatic pest detection on bean and potato crops by applying neural classifiers
Gené-Mola et al. Assessing the performance of rgb-d sensors for 3d fruit crop canopy characterization under different operating and lighting conditions
Yousefi et al. A systematic literature review on the use of deep learning in precision livestock detection and localization using unmanned aerial vehicles
CN109843034A (en) Production forecast for wheatland
CN114898405B (en) Portable broiler chicken anomaly monitoring system based on edge calculation
Farjon et al. Deep-learning-based counting methods, datasets, and applications in agriculture: A review
Sheng et al. Rice growth stage classification via RF-based machine learning and image processing
Saradopoulos et al. Edge computing for vision-based, urban-insects traps in the context of smart cities
Dandekar et al. Weed Plant Detection from Agricultural Field Images using YOLOv3 Algorithm
Rybacki et al. Convolutional Neural Network (CNN) Model for the Classification of Varieties of Date Palm Fruits (Phoenix dactylifera L.)
JP2022114352A (en) Estimation system by artificial intelligence (ai), learning data generator, learning device, picking object estimation device, learning system, and program
Agbele et al. Application of local binary patterns and cascade AdaBoost classifier for mice behavioural patterns detection and analysis
Echalar et al. PaLife: a mobile application for palay (Rice) health condition classification utilizing image processing and pigment analysis towards sustainability of palay production
CN116977960A (en) Rice seedling row detection method based on example segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant